Playing MP3 using the Java Sound API


Can you suggest how I can write a program that plays a song?

I tried the following snippet, but I get an exception:

import sun.audio.*;
import java.io.*;

class tester {
 public static void main(String args[]) throws Exception {
  InputStream in=new FileInputStream("tester.mp3");
  AudioStream as=new AudioStream(in);
  AudioPlayer.player.start(as);
 }
}

I don't believe the Java AudioStream class natively supports MP3. As far as I know, it only supports AIFF, AU and WAV. You will have to look into another library to load and play MP3 files.
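
For the formats Java Sound does handle out of the box (e.g. a short WAV), a minimal sketch using javax.sound.sampled looks like the following (the WavClipDemo class and file name are placeholders):

    import java.io.File;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.Clip;

    public class WavClipDemo {
        public static void main(String[] args) throws Exception {
            // Works for the natively supported formats (WAV/AIFF/AU), not MP3.
            AudioInputStream ais = AudioSystem.getAudioInputStream(new File("tester.wav"));
            Clip clip = AudioSystem.getClip();
            clip.open(ais);     // loads the whole (short) sound into memory
            clip.start();       // playback is asynchronous
            Thread.sleep(clip.getMicrosecondLength() / 1000); // keep the JVM alive
            clip.close();
        }
    }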


On how to play MP3 files in Java: the Sun Sound API does not play MP3.

When I needed this, I used the JavaZOOM JLayer library (http://www.javazoom.net/javalayer/javalayer.html).


To actually get MP3 playing, you still need to add MP3SPI (distributed alongside JLayer).
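
With MP3SPI (and its tritonus_share dependency, plus JLayer) on the class-path, the MP3 can then be decoded through the standard javax.sound.sampled API. A rough sketch, reusing the question's tester.mp3 (the Mp3SpiDemo class name is a placeholder):

    import java.io.File;
    import javax.sound.sampled.*;

    public class Mp3SpiDemo {
        public static void main(String[] args) throws Exception {
            // The MP3 SPI lets AudioSystem open the MP3 and convert it to PCM.
            AudioInputStream mp3 = AudioSystem.getAudioInputStream(new File("tester.mp3"));
            AudioFormat base = mp3.getFormat();
            AudioFormat pcm = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
                base.getSampleRate(), 16, base.getChannels(),
                base.getChannels() * 2, base.getSampleRate(), false);
            AudioInputStream decoded = AudioSystem.getAudioInputStream(pcm, mp3);

            SourceDataLine line = AudioSystem.getSourceDataLine(pcm);
            line.open(pcm);
            line.start();
            byte[] buf = new byte[4096];
            int n;
            while ((n = decoded.read(buf, 0, buf.length)) != -1) {
                line.write(buf, 0, n);   // push the decoded PCM to the speakers
            }
            line.drain();
            line.close();
            decoded.close();
        }
    }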

As mentioned, Java Sound does not support MP3 by default. To see which types it supports in any particular JRE, check
AudioSystem.getAudioFileTypes()
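
For example, a quick check of what the running JRE reports (ListAudioTypes is just a throwaway name):

    import javax.sound.sampled.AudioFileFormat;
    import javax.sound.sampled.AudioSystem;

    public class ListAudioTypes {
        public static void main(String[] args) {
            // Typically prints WAVE, AU and AIFF on a stock JRE; anything else
            // only appears when a matching service provider is on the class-path.
            for (AudioFileFormat.Type type : AudioSystem.getAudioFileTypes()) {
                System.out.println(type);
            }
        }
    }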

One way to add support for reading MP3 is to put the JMF-based MP3 plugin JAR on the application's run-time class-path.

  • That link is known to be not entirely reliable. The JAR is also available from my site.
  • As to your actual problem: while a Clip seems well suited to this kind of task, unfortunately it can only hold one second of stereo, 16-bit, 44.1 kHz sound. That is why I developed BigClip. (It also has its own looping problems; if you solve them, please report back.) The source follows below.

    package org.pscode.xui.sound.bigclip;
    
    import java.awt.Component;
    import javax.swing.*;
    
    // J2SE 1.3
    import javax.sound.sampled.*;
    
    import java.io.*;
    
    // J2SE 1.4
    import java.util.logging.*;
    
    import java.util.Arrays;
    
    /** An implementation of the javax.sound.sampled.Clip that is designed
    to handle Clips of arbitrary size, limited only by the amount of memory
    available to the app.    It uses the post 1.4 thread behaviour (daemon thread)
    that will stop the sound running after the main has exited.
    <ul>
    <li>2012-07-24 - Fixed bug in size of byte array (2^16 -> (int)Math.pow(2, 16)).
    <li>2009-09-01 - Fixed bug that had clip ..clipped at the end, by calling drain() (before
    calling stop()) on the dataline after the play loop was complete. Improvement to frame
    and microsecond position determination.
    <li>2009-08-17 - added convenience constructor that accepts a Clip. Changed the private
    convertFrameToM..seconds methods from 'micro' to 'milli' to reflect that they were dealing
    with units of 1000/th of a second.
    <li>2009-08-14 - got rid of flush() after the sound loop, as it was cutting off tracks just
    before the end, and was found to be not needed for the fast-forward/rewind functionality it
    was introduced to support.
    <li>2009-08-11 - First binary release.
    </ul>
    N.B. Remove @Override notation and logging to use in 1.3+
    @since 1.5
    @version 2009-08-17
    @author Andrew Thompson */
    public class BigClip implements Clip, LineListener {
    
        /** The DataLine used by this Clip. */
        private SourceDataLine dataLine;
    
        /** The raw bytes of the audio data. */
        private byte[] audioData;
    
        /** The stream wrapper for the audioData. */
        private ByteArrayInputStream inputStream;
    
        /** Loop count set by the calling code. */
        private int loopCount;
        /** Internal count of how many loops to go. */
        private int countDown;
        /** The start of a loop point.    Defaults to 0. */
        private int loopPointStart;
        /** The end of a loop point.    Defaults to the end of the Clip. */
        private int loopPointEnd;
    
        /** Stores the current frame position of the clip. */
        private int framePosition;
    
        /** Thread used to run() sound. */
        private Thread thread;
        /** Whether the sound is currently playing or active. */
        private boolean active;
        /** Stores the last time bytes were dumped to the audio stream. */
        private long timelastPositionSet;
    
        private int bufferUpdateFactor = 2;
    
        /** The parent Component for the loading progress dialog.    */
        Component parent = null;
    
        /** Used for reporting messages. */
        private Logger logger = Logger.getAnonymousLogger();
    
        /** Default constructor for a BigClip.    Does nothing.    Information from the
        AudioInputStream passed in open() will be used to get an appropriate SourceDataLine. */
        public BigClip() {}
    
        /** There are a number of AudioSystem methods that will return a configured Clip.    This
        convenience constructor allows us to obtain a SourceDataLine for the BigClip that uses
        the same AudioFormat as the original Clip.
        @param clip Clip The Clip used to configure the BigClip. */
        public BigClip(Clip clip) throws LineUnavailableException {
            dataLine = AudioSystem.getSourceDataLine( clip.getFormat() );
        }
    
        /** Provides the entire audio buffer of this clip.
        @return audioData byte[] The bytes of the audio data that is loaded in this Clip. */
        public byte[] getAudioData() {
            return audioData;
        }
    
        /** Sets a parent component to act as owner of a "Loading track.." progress dialog.
        If null, there will be no progress shown. */
        public void setParentComponent(Component parent) {
            this.parent = parent;
        }
    
        /** Converts a frame count to a duration in milliseconds. */
        private long convertFramesToMilliseconds(int frames) {
            return (frames/(long)dataLine.getFormat().getSampleRate())*1000;
        }
    
        /** Converts a duration in milliseconds to a frame count. */
        private int convertMillisecondsToFrames(long milliseconds) {
            return (int)(milliseconds/dataLine.getFormat().getSampleRate());
        }
    
        @Override
        public void update(LineEvent le) {
            logger.log(Level.FINEST, "update: " + le );
        }
    
        @Override
        public void loop(int count) {
            logger.log(Level.FINEST, "loop(" + count + ") - framePosition: " + framePosition);
            loopCount = count;
            countDown = count;
            active = true;
            inputStream.reset();
    
            start();
        }
    
        @Override
        public void setLoopPoints(int start, int end) {
            if (
                start<0 ||
                start>audioData.length-1 ||
                end<0 ||
                end>audioData.length
                ) {
                throw new IllegalArgumentException(
                    "Loop points '" +
                    start +
                    "' and '" +
                    end +
                    "' cannot be set for buffer of size " +
                    audioData.length);
            }
            if (start>end) {
                throw new IllegalArgumentException(
                    "End position " +
                    end +
                    " preceeds start position " + start);
            }
    
            loopPointStart = start;
            framePosition = loopPointStart;
            loopPointEnd = end;
        }
    
        @Override
        public void setMicrosecondPosition(long milliseconds) {
            framePosition = convertMillisecondsToFrames(milliseconds);
        }
    
        @Override
        public long getMicrosecondPosition() {
            return convertFramesToMilliseconds(getFramePosition());
        }
    
        @Override
        public long getMicrosecondLength() {
            return convertFramesToMilliseconds(getFrameLength());
        }
    
        @Override
        public void setFramePosition(int frames) {
            framePosition = frames;
            int offset = framePosition*format.getFrameSize();
            try {
                inputStream.reset();
                inputStream.read(new byte[offset]);
            } catch(Exception e) {
                e.printStackTrace();
            }
        }
    
        @Override
        public int getFramePosition() {
            long timeSinceLastPositionSet = System.currentTimeMillis() - timelastPositionSet;
            int size = dataLine.getBufferSize()*(format.getChannels()/2)/bufferUpdateFactor;
            int framesSinceLast = (int)((timeSinceLastPositionSet/1000f)*
                dataLine.getFormat().getFrameRate());
            int framesRemainingTillTime = size - framesSinceLast;
            return framePosition
                - framesRemainingTillTime;
        }
    
        @Override
        public int getFrameLength() {
            return audioData.length/format.getFrameSize();
        }
    
        AudioFormat format;
    
        @Override
        public void open(AudioInputStream stream) throws
            IOException,
            LineUnavailableException {
    
            AudioInputStream is1;
            format = stream.getFormat();
    
            if (format.getEncoding()!=AudioFormat.Encoding.PCM_SIGNED) {
                is1 = AudioSystem.getAudioInputStream(
                    AudioFormat.Encoding.PCM_SIGNED, stream );
            } else {
                is1 = stream;
            }
            format = is1.getFormat();
            InputStream is2;
            if (parent!=null) {
                ProgressMonitorInputStream pmis = new ProgressMonitorInputStream(
                    parent,
                    "Loading track..",
                    is1);
                pmis.getProgressMonitor().setMillisToPopup(0);
                is2 = pmis;
            } else {
                is2 = is1;
            }
    
            byte[] buf = new byte[ (int)Math.pow(2, 16) ];
            int totalRead = 0;
            int numRead = 0;
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            numRead = is2.read( buf );
            while (numRead>-1) {
                baos.write( buf, 0, numRead );
                numRead = is2.read( buf, 0, buf.length );
                totalRead += numRead;
            }
            is2.close();
            audioData = baos.toByteArray();
            AudioFormat afTemp;
            if (format.getChannels()<2) {
                afTemp = new AudioFormat(
                    format.getEncoding(),
                    format.getSampleRate(),
                    format.getSampleSizeInBits(),
                    2,
                    format.getSampleSizeInBits()*2/8, // calculate frame size
                    format.getFrameRate(),
                    format.isBigEndian()
                    );
            } else {
                afTemp = format;
            }
    
            setLoopPoints(0,audioData.length);
            dataLine = AudioSystem.getSourceDataLine(afTemp);
            dataLine.open();
            inputStream = new ByteArrayInputStream( audioData );
        }
    
        @Override
        public void open(AudioFormat format,
            byte[] data,
            int offset,
            int bufferSize)
            throws LineUnavailableException {
            byte[] input = new byte[bufferSize];
            for (int ii=0; ii<input.length; ii++) {
                input[ii] = data[offset+ii];
            }
            ByteArrayInputStream inputStream = new ByteArrayInputStream(input);
            try {
                AudioInputStream ais1 = AudioSystem.getAudioInputStream(inputStream);
                AudioInputStream ais2 = AudioSystem.getAudioInputStream(format, ais1);
                open(ais2);
            } catch( UnsupportedAudioFileException uafe ) {
                throw new IllegalArgumentException(uafe);
            } catch( IOException ioe ) {
                throw new IllegalArgumentException(ioe);
            }
            // TODO    -    throw IAE for invalid frame size, format.
        }
    
        @Override
        public float getLevel() {
            return dataLine.getLevel();
        }
    
        @Override
        public long getLongFramePosition() {
            return dataLine.getLongFramePosition()*2/format.getChannels();
        }
    
        @Override
        public int available() {
            return dataLine.available();
        }
    
        @Override
        public int getBufferSize() {
            return dataLine.getBufferSize();
        }
    
        @Override
        public AudioFormat getFormat() {
            return format;
        }
    
        @Override
        public boolean isActive() {
            return dataLine.isActive();
        }
    
        @Override
        public boolean isRunning() {
            return dataLine.isRunning();
        }
    
        @Override
        public boolean isOpen() {
            return dataLine.isOpen();
        }
    
        @Override
        public void stop() {
            logger.log(Level.FINEST, "BigClip.stop()");
            active = false;
            // why did I have this commented out?
            dataLine.stop();
            if (thread!=null) {
                try {
                    active = false;
                    thread.join();
                } catch(InterruptedException wakeAndContinue) {
                }
            }
        }
    
        public byte[] convertMonoToStereo(byte[] data, int bytesRead) {
            byte[] tempData = new byte[bytesRead*2];
            if (format.getSampleSizeInBits()==8) {
                for(int ii=0; ii<bytesRead; ii++) {
                    byte b = data[ii];
                    tempData[ii*2] = b;
                    tempData[ii*2+1] = b;
                }
            } else {
                for(int ii=0; ii<bytesRead-1; ii+=2) {
                    //byte b2 = is2.read();
                    byte b1 = data[ii];
                    byte b2 = data[ii+1];
                    tempData[ii*2] = b1;
                    tempData[ii*2+1] = b2;
                    tempData[ii*2+2] = b1;
                    tempData[ii*2+3] = b2;
                }
            }
            return tempData;
        }
    
        boolean fastForward;
        boolean fastRewind;
    
        public void setFastForward(boolean fastForward) {
            logger.log(Level.FINEST, "FastForward " + fastForward);
            this.fastForward = fastForward;
            fastRewind = false;
            flush();
        }
    
        public boolean getFastForward() {
            return fastForward;
        }
    
        public void setFastRewind(boolean fastRewind) {
            logger.log(Level.FINEST, "FastRewind " + fastRewind);
            this.fastRewind = fastRewind;
            fastForward = false;
            flush();
        }
    
        public boolean getFastRewind() {
            return fastRewind;
        }
    
        /** TODO - fix bug in LOOP_CONTINUOUSLY */
        @Override
        public void start() {
            Runnable r = new Runnable() {
                public void run() {
                    try {
                        /* Should these open()/close() calls be here, or explicitly
                        called by user program?    The JavaDocs for line suggest that
                        Clip should throw an IllegalArgumentException, so we'll
                        stick with that and call it explicitly. */
                        dataLine.open();
    
                        dataLine.start();
                        int bytesRead = 0;
                        int frameSize = dataLine.getFormat().getFrameSize();
                        int bufSize = dataLine.getBufferSize();
                        boolean startOrMove = true;
                        byte[] data = new byte[bufSize];
                        int offset = framePosition*frameSize;
                        int totalBytes = offset;
                        inputStream.read(new byte[offset], 0, offset);
                        logger.log(Level.FINEST, "loopCount " + loopCount );
                        while ((bytesRead = inputStream.read(data,0,data.length))
                            != -1 &&
                            (loopCount==Clip.LOOP_CONTINUOUSLY ||
                            countDown>0) &&
                            active ) {
                            logger.log(Level.FINEST,
                                "BigClip.start() loop " + framePosition );
                            totalBytes += bytesRead;
                            int framesRead;
                            byte[] tempData;
                            if (format.getChannels()<2) {
                                tempData = convertMonoToStereo(data, bytesRead);
                                framesRead = bytesRead/
                                    format.getFrameSize();
                                bytesRead*=2;
                            } else {
                                framesRead = bytesRead/
                                    dataLine.getFormat().getFrameSize();
                                tempData = Arrays.copyOfRange(data, 0, bytesRead);
                            }
                            framePosition += framesRead;
                            if (framePosition>=loopPointEnd) {
                                framePosition = loopPointStart;
                                inputStream.reset();
                                countDown--;
                                logger.log(Level.FINEST,
                                    "Loop Count: " + countDown );
                            }
                            timelastPositionSet = System.currentTimeMillis();
                            byte[] newData;
                            if (fastForward) {
                                newData = getEveryNthFrame(tempData, 2);
                            } else if (fastRewind) {
                                byte[] temp = getEveryNthFrame(tempData, 2);
                                newData = reverseFrames(temp);
                                inputStream.reset();
                                totalBytes -= 2*bytesRead;
                            framePosition -= 2*framesRead;
                                if (totalBytes<0) {
                                    setFastRewind(false);
                                    totalBytes = 0;
                                }
                                inputStream.skip(totalBytes);
                                logger.log(Level.INFO, "totalBytes " + totalBytes);
                            } else {
                                newData = tempData;
                            }
                            dataLine.write(newData, 0, newData.length);
                            if (startOrMove) {
                                data = new byte[bufSize/
                                    bufferUpdateFactor];
                                startOrMove = false;
                            }
                        }
                        logger.log(Level.FINEST,
                            "BigClip.start() loop ENDED" + framePosition );
                        active = false;
                        dataLine.drain();
                        dataLine.stop();
                        /* should these open()/close() be here, or explicitly
                        called by user program? */
                        dataLine.close();
                    } catch (LineUnavailableException lue) {
                        logger.log( Level.SEVERE,
                            "No sound line available!", lue );
                        if (parent!=null) {
                            JOptionPane.showMessageDialog(
                                parent,
                                "Clear the sound lines to proceed",
                                "No audio lines available!",
                                JOptionPane.ERROR_MESSAGE);
                        }
                    }
                }
            };
            thread= new Thread(r);
            // makes thread behaviour compatible with JavaSound post 1.4
            thread.setDaemon(true);
            thread.start();
        }
    
        /** Assume the frame size is 4. */
        public byte[] reverseFrames(byte[] data) {
            byte[] reversed = new byte[data.length];
            byte[] frame = new byte[4];
    
            for (int ii=0; ii<data.length/4; ii++) {
                int first = (data.length)-((ii+1)*4)+0;
                int last = (data.length)-((ii+1)*4)+3;
                frame[0] = data[first];
                frame[1] = data[(data.length)-((ii+1)*4)+1];
                frame[2] = data[(data.length)-((ii+1)*4)+2];
                frame[3] = data[last];
    
                reversed[ii*4+0] = frame[0];
                reversed[ii*4+1] = frame[1];
                reversed[ii*4+2] = frame[2];
                reversed[ii*4+3] = frame[3];
                if (ii<5 || ii>(data.length/4)-5) {
                    logger.log(Level.FINER, "From \t" + first + " \tlast " + last );
                    logger.log(Level.FINER, "To \t" + ((ii*4)+0) + " \tlast " + ((ii*4)+3) );
                }
            }
    
    /*
            for (int ii=0; ii<data.length; ii++) {
                reversed[ii] = data[data.length-1-ii];
            }
    */
    
            return reversed;
        }
    
        /** Assume the frame size is 4. */
        public byte[] getEveryNthFrame(byte[] data, int skip) {
            int length = data.length/skip;
            length = (length/4)*4;
            logger.log(Level.FINEST, "length " + data.length + " \t" + length);
            byte[] b = new byte[length];
            //byte[] frame = new byte[4];
            for (int ii=0; ii<b.length/4; ii++) {
                b[ii*4+0] = data[ii*skip*4+0];
                b[ii*4+1] = data[ii*skip*4+1];
                b[ii*4+2] = data[ii*skip*4+2];
                b[ii*4+3] = data[ii*skip*4+3];
            }
            return b;
        }
    
        @Override
        public void flush() {
            dataLine.flush();
        }
    
        @Override
        public void drain() {
            dataLine.drain();
        }
    
        @Override
        public void removeLineListener(LineListener listener) {
            dataLine.removeLineListener(listener);
        }
    
        @Override
        public void addLineListener(LineListener listener) {
            dataLine.addLineListener(listener);
        }
    
        @Override
        public Control getControl(Control.Type control) {
            return dataLine.getControl(control);
        }
    
        @Override
        public Control[] getControls() {
            if (dataLine==null) {
                return new Control[0];
            } else {
                return dataLine.getControls();
            }
        }
    
        @Override
        public boolean isControlSupported(Control.Type control) {
            return dataLine.isControlSupported(control);
        }
    
        @Override
        public void close() {
            dataLine.close();
        }
    
        @Override
        public void open() throws LineUnavailableException {
            throw new IllegalArgumentException("illegal call to open() in interface Clip");
        }
    
        @Override
        public Line.Info getLineInfo() {
            return dataLine.getLineInfo();
        }
    
        /** Determines the single largest sample size of all channels of the current clip.
    This can be handy for determining a fraction to scale visual representations.
        @return Double between 0 & 1 representing the maximum signal level of any channel. */
        public double getLargestSampleSize() {
    
            int largest = 0;
            int current;
    
            boolean signed = (format.getEncoding()==AudioFormat.Encoding.PCM_SIGNED);
            int bitDepth = format.getSampleSizeInBits();
            boolean bigEndian = format.isBigEndian();
    
            int samples = audioData.length*8/bitDepth;
    
            if (signed) {
                if (bitDepth/8==2) {
                    if (bigEndian) {
                        for (int cc = 0; cc < samples; cc++) {
                            current = (audioData[cc*2]*256 + (audioData[cc*2+1] & 0xFF));
                            if (Math.abs(current)>largest) {
                                largest = Math.abs(current);
                            }
                        }
                    } else {
                        for (int cc = 0; cc < samples; cc++) {
                            current = (audioData[cc*2+1]*256 + (audioData[cc*2] & 0xFF));
                            if (Math.abs(current)>largest) {
                                largest = Math.abs(current);
                            }
                        }
                    }
                } else {
                    for (int cc = 0; cc < samples; cc++) {
                        current = (audioData[cc] & 0xFF);
                        if (Math.abs(current)>largest) {
                            largest = Math.abs(current);
                        }
                    }
                }
            } else {
                if (bitDepth/8==2) {
                    if (bigEndian) {
                        for (int cc = 0; cc < samples; cc++) {
                            current = (audioData[cc*2]*256 + (audioData[cc*2+1] - 0x80));
                            if (Math.abs(current)>largest) {
                                largest = Math.abs(current);
                            }
                        }
                    } else {
                        for (int cc = 0; cc < samples; cc++) {
                            current = (audioData[cc*2+1]*256 + (audioData[cc*2] - 0x80));
                            if (Math.abs(current)>largest) {
                                largest = Math.abs(current);
                            }
                        }
                    }
                } else {
                    for (int cc = 0; cc < samples; cc++) {
                        if ( audioData[cc]>0 ) {
                            current = (audioData[cc] - 0x80);
                            if (Math.abs(current)>largest) {
                                largest = Math.abs(current);
                            }
                        } else {
                            current = (audioData[cc] + 0x80);
                            if (Math.abs(current)>largest) {
                                largest = Math.abs(current);
                            }
                        }
                    }
                }
            }
    
            // audioData
            logger.log(Level.FINEST, "Max signal level: " + (double)largest/(Math.pow(2, bitDepth-1)));
            return (double)largest/(Math.pow(2, bitDepth-1));
        }
    }
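
A rough usage sketch for the class above (BigClipDemo and the file name are placeholders; it assumes an MP3 decoder service provider such as MP3SPI is on the class-path so that AudioSystem can open the MP3 as an AudioInputStream):

    import java.io.File;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import org.pscode.xui.sound.bigclip.BigClip;

    public class BigClipDemo {
        public static void main(String[] args) throws Exception {
            BigClip clip = new BigClip();
            AudioInputStream ais = AudioSystem.getAudioInputStream(new File("tester.mp3"));
            clip.open(ais);   // decodes and buffers the entire track in memory
            clip.loop(1);     // loop() arms the internal 'active' flag and then calls start()
            // Playback runs on a daemon thread, so keep the JVM alive for roughly the
            // track length. Note this class's getMicrosecondLength() actually reports
            // milliseconds (see convertFramesToMilliseconds above).
            Thread.sleep(clip.getMicrosecondLength());
            clip.close();
        }
    }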
    
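A simpler alternative is to hand the stream straight to JLayer's Player (javazoom.jl.player.Player); the file path below is a placeholder, and the wrapper class is only there to make the snippet compile:
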
    import java.io.*;
    import javazoom.jl.player.Player;   // from the JLayer library

    public class JLayerDemo {
        public static void main(String[] args) throws Exception {
            String file = "SOME//SONGFILE//PATH";
            BufferedInputStream bis = new BufferedInputStream(new FileInputStream(file));
            Player player = new Player(bis);
            player.play();   // blocks until the track has finished
        }
    }