
HTML5 Canvas

Cover of HTML5 Canvas by Steve Fulton... Published by O'Reilly Media, Inc.
  1. HTML5 Canvas
    1. SPECIAL OFFER: Upgrade this ebook with O’Reilly
    2. A Note Regarding Supplemental Files
    3. Preface
      1. Running the Examples in the Book
      2. What You Need to Know
      3. How This Book Is Organized
      4. Conventions Used in This Book
      5. Using Code Examples
      6. We’d Like to Hear from You
      7. Safari® Books Online
      8. Acknowledgments
    4. 1. Introduction to HTML5 Canvas
      1. The Basic HTML Page
      2. Basic HTML We Will Use in This Book
      3. The Document Object Model (DOM) and Canvas
      4. JavaScript and Canvas
      5. HTML5 Canvas “Hello World!”
      6. Debugging with Console.log
      7. The 2D Context and the Current State
      8. The HTML5 Canvas Object
      9. Another Example: Guess The Letter
      10. What’s Next
    5. 2. Drawing on the Canvas
      1. The Basic File Setup for This Chapter
      2. The Basic Rectangle Shape
      3. The Canvas State
      4. Using Paths to Create Lines
      5. Advanced Path Methods
      6. Compositing on the Canvas
      7. Simple Canvas Transformations
      8. Filling Objects with Colors and Gradients
      9. Filling Shapes with Patterns
      10. Creating Shadows on Canvas Shapes
      11. What’s Next
    6. 3. The HTML5 Canvas Text API
      1. Displaying Basic Text
      2. Setting the Text Font
      3. Text and the Canvas Context
      4. Text with Gradients and Patterns
      5. Width, Height, Scale, and toDataURL() Revisited
      6. Final Version of Text Arranger
      7. What’s Next
    7. 4. Images on the Canvas
      1. The Basic File Setup for This Chapter
      2. Image Basics
      3. Simple Cell-Based Sprite Animation
      4. Advanced Cell-Based Animation
      5. Applying Rotation Transformations to an Image
      6. Creating a Grid of Tiles
      7. Zooming and Panning an Image
      8. Pixel Manipulation
      9. Copying from One Canvas to Another
      10. What’s Next
    8. 5. Math, Physics, and Animation
      1. Moving in a Straight Line
      2. Bouncing Off Walls
      3. Curve and Circular Movement
      4. Simple Gravity, Elasticity, and Friction
      5. Easing
      6. What’s Next?
    9. 6. Mixing HTML5 Video and Canvas
      1. HTML5 Video Support
      2. Converting Video Formats
      3. Basic HTML5 Video Implementation
      4. Preloading Video in JavaScript
      5. Video and the Canvas
      6. Video on the Canvas Examples
      7. Animation Revisited: Moving Videos
      8. What’s Next?
    10. 7. Working with Audio
      1. The Basic <audio> Tag
      2. Audio Formats
      3. Audio Tag Properties, Functions, and Events
      4. Playing a Sound with No Audio Tag
      5. Creating a Canvas Audio Player
      6. Case Study in Audio: Space Raiders Game
      7. What’s Next
    11. 8. Canvas Game Essentials
      1. Why Games in HTML5?
      2. Our Basic Game HTML5 File
      3. Our Game’s Design
      4. Game Graphics: Drawing with Paths
      5. Animating on the Canvas
      6. Applying Transformations to Game Graphics
      7. Game Graphic Transformations
      8. Game Object Physics and Animation
      9. A Basic Game Framework
      10. Putting It All Together
      11. The player Object
      12. Geo Blaster Game Algorithms
      13. The Geo Blaster Basic Full Source
      14. Rock Object Prototype
      15. What’s Next
    12. 9. Combining Bitmaps and Sound
      1. Geo Blaster Extended
      2. Creating a Dynamic Tile Sheet at Runtime
      3. A Simple Tile-Based Game
      4. What’s Next
    13. 10. Mobilizing Games with PhoneGap
      1. Going Mobile!
      2. Creating the iOS Application with PhoneGap
      3. Beyond the Canvas
      4. What’s Next
    14. 11. Further Explorations
      1. 3D with WebGL
      2. Multiplayer Applications with ElectroServer 5
      3. Conclusion
    15. Index
    16. About the Authors
    17. Colophon
    18. SPECIAL OFFER: Upgrade this ebook with O’Reilly

Creating a Canvas Audio Player

Now that we can play an audio file directly in an HTML page using the <audio> tag, or through JavaScript by creating a dynamic HTMLAudioElement object, it’s time to step up our game. We are going to create an audio player on the canvas that we can use to control dynamically loaded audio files. Why do we want to do this? Well, while the audio controls baked into HTML5-compliant browsers might look decent, it is often necessary for developers to implement a design that more closely matches a particular website. HTML5 Canvas provides a way to create a dynamic set of audio controls with nearly any look-and-feel you desire.

However, this flexibility comes at a cost. HTML5 Canvas does not natively support common GUI controls such as push buttons, toggle buttons, and sliders. So to create a decent audio player, we need to make these types of GUI user controls from scratch. We could create these controls in HTML and JavaScript, but we have already covered communication between HTML and Canvas via form controls several times in this book. You wanted to know how to make HTML5 Canvas apps when you started reading, so we won’t pull any punches in this chapter.

Creating Custom User Controls on the Canvas

For this application we are going to create four elements:

Play/pause push button

The audio file is either playing or is paused. Whichever state it is currently in, we show the other button (i.e., show pause when playing).

A sliding progress bar

This is a noninteractive slider. It displays how much of the audio track has played and how much is left to play. The movement of this bar needs to be dynamic and based on the duration and currentTime properties of the HTMLAudioElement object.

An interactive volume slider

We want to create a sliding volume control that the user can manipulate with a click-and-drag operation. This is the trickiest control we will build because Canvas has no native support for click-and-drag.

A loop toggle button

This is a bonus. Most of the default embedded HTML5 audio players do not have a loop/no-loop toggle button, but we are going to add one. Already, we are outstripping the functionality of standard HTML5!

Figure 7-5 shows the audiocontrols.png image that we created. It holds all the images we will use for the audio player. The top row consists of:

  • The play state of the play/pause button

  • The background of the play slider

  • The moving slider we will use for the play and volume sliders

The second row consists of:

  • The pause state of the play/pause button

  • The background of the volume slider

  • The “off” state of the loop button

  • The “on” state of the loop button

audiocontrols.png

Figure 7-5. audiocontrols.png

Loading the Button Assets

Since we are going to load in both an audio file and an image file for this application, we need to employ a strategy that will allow us to preload two assets instead of just one. This process is much like the one we employed in Chapter 6 when we created controls for a video. Previously in this chapter, we used a function named audioLoaded() to make sure the audio was loaded before we started using it. However, that strategy will not work when we have two assets to load. We could create two separate event listeners, but then what if we need to load 3, 4, or 10 assets? What we need is a simple way to ensure that we can preload any number of assets before our application executes.

We will start this process by creating some variables that are global in scope to all the functions in the application. First, outside of all the JavaScript functions, we will create three new variables—loadCount, itemsToLoad, and buttonSheet:

loadCount

This variable will be used as a counter. When an asset has preloaded we will increment this value.

itemsToLoad

This is a numeric value that represents the number of assets we need to load before we can execute the application in the HTML page.

buttonSheet

This variable will hold a reference to the audiocontrols.png image shown in Figure 7-5. We will use it to create our audio controls.

Here is the code with values included:

var loadCount = 0;
var itemsToLoad = 2;
var buttonSheet;
var audioElement;

Note

To make these variables scoped only to the Canvas app and not globally to all of JavaScript, you can encapsulate this code in a function(). The final version of the code in Example 7-6 shows that process.
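Here is a minimal sketch of that encapsulation, using an immediately invoked function; the wrapper itself is our illustration, not the book's final listing:

```javascript
// Wrapping the setup in an immediately invoked function keeps these
// variables private to the app instead of global to all of JavaScript.
(function () {
   var loadCount = 0;
   var itemsToLoad = 2;
   var buttonSheet;
   var audioElement;
   // ...event handlers and canvasApp() would be defined in here...
})();
```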

Inside the eventWindowLoaded() function we now need to set the event handlers for the assets to load. For the audioElement, we will change the handler from audioLoaded to itemLoaded:

audioElement.addEventListener("canplaythrough",itemLoaded,false);

To load and use the audiocontrols.png image, we first create a new Image() object and place a reference to it into the buttonSheet variable. Next, we set the src attribute of the new Image object to the image file we want to load—in this case, audiocontrols.png. We then set the onload event handler of the Image object to itemLoaded, which is the same event handler we used for the audio file:

buttonSheet = new Image();
buttonSheet.onload = itemLoaded;
buttonSheet.src = "audiocontrols.png";

Now we need to create the itemLoaded() event handler. This function is quite simple. Every time it is called, we increment the loadCount variable. We then test loadCount to see whether it is equal to or has surpassed the number of items we want to preload, which is represented by the itemsToLoad variable. If so, we call the canvasApp() function to start our application:

function itemLoaded(event) {

   loadCount++;
   if (loadCount >= itemsToLoad) {
      canvasApp();

   }

}
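As a side note, the same counter pattern can be generalized so one factory covers any number of assets; this is our own sketch (createPreloader is a hypothetical helper, not part of the book's listing):

```javascript
// Hypothetical generalization of the itemLoaded() pattern: returns a
// handler that fires onReady once itemsToLoad load events have arrived.
function createPreloader(itemsToLoad, onReady) {
   var loadCount = 0;
   return function itemLoaded(event) {
      loadCount++;
      if (loadCount >= itemsToLoad) {
         onReady();
      }
   };
}

// Usage sketch: one handler shared by the audio file and the image.
// var itemLoaded = createPreloader(2, canvasApp);
// audioElement.addEventListener("canplaythrough", itemLoaded, false);
// buttonSheet.onload = itemLoaded;
```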

Setting Up the Audio Player Values

Inside the canvasApp() function we need to create some values to help us place all the various buttons and sliders on the canvas.

First, bH represents the height of all the controls; bW represents the width of a standard button (play/pause, loop/not loop):

var bW = 32;
var bH = 32;

Next, we set the width of the playback area, playBackW, and the width of the volume background, volBackW. We also set the slider’s width (sliderW) and height (sliderH):

var playBackW = 206;
var volBackW = 50;
var sliderW = 10;
var sliderH = 32;

We also need a couple variables to represent the x and y locations on the canvas where we will start to build our audio controls. We will define those as controlStartX and controlStartY:

var controlStartX = 25;
var controlStartY = 200;

Finally, we need to specify the x and y locations for the play/pause button (playX, playY), the playing slider background (playBackX, playBackY), the volume slider background (volBackX, volBackY), and the location of the loop/no loop toggle button (loopX, loopY):

var playX = controlStartX;
var playY = controlStartY;
var playBackX = controlStartX+bW;
var playBackY = controlStartY;
var volBackX = controlStartX+bW+playBackW;
var volBackY = controlStartY;
var loopX = controlStartX+bW+playBackW+volBackW;
var loopY = controlStartY;

We are going to use all of these values to help design and add functionality to our audio controls. It may seem like overkill to create so many variables, but when trying to “roll your own” collision detection and drag-and-drop functionality into the canvas, having variable names to manipulate instead of literals makes the job much easier.

Mouse Events

Since we are going to create our own functions for interactivity between the mouse and our custom canvas audio controls, we need to add some event handlers for certain common mouse events.

First, we need to create a couple variables—mouseX and mouseY—that will hold the current x and y locations of the mouse pointer:

var mouseX;
var mouseY;

Next, we need to create the event handlers. First, we listen for the mouseup event. This event fires when a user stops pressing the mouse button. We will listen for this event when we are trying to determine whether we should stop dragging the volume slider:

theCanvas.addEventListener("mouseup",eventMouseUp, false);

We also need to listen for the mousedown event to determine whether the play/pause button was pressed, the loop on/off toggle button was pressed, and/or the volume slider was clicked so we can start dragging it:

theCanvas.addEventListener("mousedown",eventMouseDown, false);

Finally, we listen for mousemove so we can figure out the current x and y locations of the mouse pointer. We use this value to determine whether buttons have been pressed, as well as whether the volume slider has been clicked and/or dragged:

theCanvas.addEventListener("mousemove",eventMouseMove, false);

Sliding Play Indicator

The sliding play indicator is the simplest control we are going to draw onto the canvas. It is not interactive—it just gives the user a visual indication of how much of the audio clip is left to play.

First of all, in canvasApp() we need to make sure that we call the drawScreen() function on an interval, so our updated controls will be displayed:

setInterval(drawScreen, 33);

Note

Unlike when displaying video on the canvas, we do not have to call drawScreen() to update the playing audio. In JavaScript, audio plays completely separate from the canvas. Calling drawScreen() on an interval is necessary because the audio controls we are creating need to be updated as the audio plays.

In the drawScreen() function we need to draw the slider and background on the canvas. We are going to “cut” all the images we display from the single buttonSheet image we loaded from audiocontrols.png. To draw the background, we use the values we set up earlier. We use literals (i.e., 32,0) to locate the starting point of the image because those values do not change on the buttonSheet image. However, we use the variables we created to find the width and height, and to locate the final position of the background on the canvas:

context.drawImage(buttonSheet, 32,0,playBackW,bH,playBackX,playBackY,playBackW,bH);

Drawing the play slider is only a bit more complicated. Before we draw it, we need to create a variable that represents the relationship between the length of the playing audio and the width of the slider area. This is so we will know how far on the x-axis to move the slider based on how much of the song has played. This may sound complicated, but it’s just a simple fraction. Divide the width of the play background (playBackW) by the duration of the playing audio (audioElement.duration). We will store that ratio in slideIncrement and use it to place the play slider on the canvas:

var slideIncrement = playBackW/audioElement.duration;

Now we need to calculate the x position of the slider. The x position is the sum of the slider’s starting position (the place on the canvas where the controls start plus the width of the play/pause button: controlStartX+bW) plus the audio’s current play position. We calculate the play position by taking the ratio we just created, slideIncrement, and multiplying it by the current play time of the audio clip (audioElement.currentTime). That’s it!

var sliderX = (controlStartX+bW) + 
    (slideIncrement*audioElement.currentTime);
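To make the arithmetic concrete, here is a quick sketch using the layout values from this section and a hypothetical 100-second clip that is halfway through playback:

```javascript
// Layout values defined earlier in canvasApp().
var controlStartX = 25;
var bW = 32;             // width of the play/pause button
var playBackW = 206;     // width of the play slider background

// Hypothetical audio state: a 100-second clip, 50 seconds in.
var duration = 100;
var currentTime = 50;

// Pixels of slider travel per second of audio.
var slideIncrement = playBackW / duration;

// Slider x = left edge of the play background + elapsed travel,
// which lands the slider at the midpoint of its track.
var sliderX = (controlStartX + bW) + (slideIncrement * currentTime);
console.log(Math.round(sliderX)); // 160
```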

Now all we need to do is draw the image onto the canvas, and then test to see whether the audio clip has ended. If it has ended and the loop property is not set, we move the play position back to the beginning of the playback area by setting the currentTime property to 0, and we call audioElement.pause() to stop the audio clip. If loop is set, the browser restarts playback from the beginning automatically:

context.drawImage(buttonSheet, 238,0,sliderW,bH,sliderX,controlStartY,sliderW,bH);

if (audioElement.ended && !audioElement.loop) {
   audioElement.currentTime = 0;
   audioElement.pause();
}

This leads us right into our next topic, handling the play/pause button.

Play/Pause Push Button: Hit Test Point Revisited

The first thing we need to do when implementing the play/pause button is create the event handler for the mousemove event. The function really is just the standard cross-browser code we introduced earlier in the book for tracking the mouse position, depending on which properties the browser’s DOM supports: layerX/layerY or offsetX/offsetY. This function is called every time the mouse is moved on the canvas to update the mouseX and mouseY variables. Those variables are scoped to canvasApp() so all functions defined inside of it can access them:

function eventMouseMove(event) {
   if (event.layerX || event.layerX == 0) { // Firefox
      mouseX = event.layerX;
      mouseY = event.layerY;
   } else if (event.offsetX || event.offsetX == 0) { // Opera
      mouseX = event.offsetX;
      mouseY = event.offsetY;
   }
}

Now we need to create the eventMouseUp() handler function. This function is called when the user releases the mouse button after clicking. Why after and not when the mouse is clicked? Well, one reason is because we generally use the mousedown event for the start of a “dragging” operation, which we will show you shortly.

The heart of this function is a hit test point-style collision detection check for the buttons. We discussed this in depth in Chapter 6 when we created the buttons for the video puzzle game (CH6EX10.html). Notice that here we are using the variables we create to represent the x and y locations of the button (playX, playY) and the width and height of a button (bW, bH) to form the bounds of the area we will test. If the mouse pointer is within those bounds, we know the button has been clicked:

function eventMouseUp(event) {

if ( (mouseY >= playY) && (mouseY <= playY+bH) && (mouseX >= playX) && 
     (mouseX <= playX+bW) ) {

Note

If you had images stacked on top of one another, you would need to store some kind of stacking value or z-index to know which item was on top and was clicked at any one time. Because the canvas works in immediate mode, you would have to “roll your own” just like the other functionality we have discussed.

After a hit is detected, we need to determine whether we are going to call the play() or pause() method of the HTMLAudioElement object represented by the audioElement variable. To figure out which method to call, we simply test to see whether the audio is paused by checking the audioElement.paused property. If so, we call the play() method; if not, we call pause(). Recall that the HTMLAudioElement.paused property is set to true if the audio is not playing, regardless of whether the pause() method was called. This means that when the application starts but we have not set autoplay, we can easily display the proper button (play or pause) just by testing this property:

      if (audioElement.paused) {
         audioElement.play();

      } else {
         audioElement.pause();

      }

   }
}
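Because the same bounds check recurs for every control, it can be factored into a small helper; hitTest below is our own hypothetical refactoring, not code from the book’s listing:

```javascript
// Hypothetical helper: true when the point (mx, my) lies inside the
// rectangle at (x, y) with width w and height h.
function hitTest(mx, my, x, y, w, h) {
   return (my >= y) && (my <= y + h) && (mx >= x) && (mx <= x + w);
}

// The play/pause check in eventMouseUp() would then read:
// if (hitTest(mouseX, mouseY, playX, playY, bW, bH)) { ... }
```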

Now, in drawScreen() we need to choose which button to display: the one representing play (green triangle) or pause (two horizontal boxes). The play button is displayed when the audio is paused, and the pause button is displayed when the audio is playing. This button is a “call to action,” so it displays what will happen when you click on it, not the status of the audio element that is playing. This inverse relationship exists because it is the standard way audio players work.

If the audioElement is paused, we display the graphic from the top row of the audiocontrols.png image represented by buttonSheet (see Figure 7-5). If it is not paused, we display the button on the second row right below it. Since that button starts at the y position of 32, we use that literal value in the call to drawImage():

if (audioElement.paused) {
   context.drawImage(buttonSheet, 0,0,bW,bH,playX,playY,bW,bH);//show play

} else {
   context.drawImage(buttonSheet, 0,32,bW,bH,playX,playY,bW,bH); //show pause

}

Note

Again, we could have represented the literal values of locations in the buttonSheet with variables, but we decided to use literals to show you the difference between how we specify buttonSheet pixel locations, and how we calculate widths and distances for placing those elements.

Loop/No Loop Toggle Button

Implementing the loop/no loop toggle button is nearly identical to implementing the play/pause button. In Figure 7-5, you can see that the last two buttons on the bottom row represent the “on” and “off” states of the loop/no loop button. Unlike the play/pause button, this button shows the “state” of looping: the lighter, 3D-looking “out” button is displayed when the audio is not set to loop. The inverse, darker button is displayed when the audio is set to loop (because it looks like the button has been pressed).

In the eventMouseUp() function, we need to add support for loop/no loop. First, we test for a hit test point on the button with the current location of the mouse pointer. This is identical to the test we did for the play/pause button, except that we use loopX and loopY to find the current location of the loop/no loop button.

Next, we check the value of audioElement.loop. We need to update the value to the opposite of the current setting. If loop is true, we set it to false; if it is false, we set it to true:

if ( (mouseY >= loopY) && (mouseY <= loopY+bH) && (mouseX >= loopX) && 
     (mouseX <= loopX+bW) ) {
   if (audioElement.loop) {
      audioElement.loop = false;
   } else {
      audioElement.loop = true;
   }

Finally, in drawScreen() we will display the proper part of the buttonSheet image for whichever loop/no loop state is currently set. Unlike play/pause, we display the “off” state when loop is false and the “on” state when it is true, because this button shows the current status rather than a call to action, so there is no inverse relationship between its states:

if (audioElement.loop) {
   context.drawImage(buttonSheet, 114,32,bW,bH,loopX,loopY,bW,bH);//loop
} else {
   context.drawImage(buttonSheet, 82,32,bW,bH,loopX,loopY,bW,bH); // no loop
}

Click-and-Drag Volume Slider

So now we make it to the last, but certainly not least, piece of functionality for the audio player: the volume slider. The volume slider is an interactive control allowing the user to manipulate it by sliding it right or left to control the volume of the playing audio element. Before we create the volume slider, we need to define some boundaries for its usage:

  • The slider never moves on the y-axis; it will always keep a constant y value.

  • The farther the volume slider is to the right (the greater the x value), the higher the volume.

  • The slider moves on the x-axis but is bounded by the starting x value of the volume slider image—volumeSliderStart on the left and volumeSliderEnd on the right.

  • When the user clicks on the volume slider, we will assume that the user wants to set the volume, so we will start “dragging” the slider. This means that if the user moves the mouse on the x-axis, we will move the slider accordingly.

  • When the user takes his finger off the mouse button, we will assume that he no longer wishes to set the volume, and we will stop “dragging” the slider.

  • The volume will be set based on the slider’s position on the x-axis in relation to the volumeSliderStart plus a ratio (volumeIncrement) that we create describing how much volume to increase or decrease based on where the slider rests.

Volume slider variables

OK, now that we have thoroughly confused you, let’s talk about the process in depth. First, we start with the canvasApp() function. In canvasApp() we need to set up some variables to set the rules we defined in the list above.

The starting x position for the volume slider is volumeSliderStart. When the application starts, it is equal to the x position of the volume background, or volBackX. This means it will start at the leftmost edge of the volume slider where the volume will be set to 0. We will update this to the correct position based on the volume as soon as we calculate that value:

var volumeSliderStart = volBackX;

The final x position for the volume slider is volumeSliderEnd, which is the rightmost position. It is the position where the volume will be set to 100% (or 1). This position lies at the x position of volumeSliderStart plus the width of the volume slider background (volBackW), less the width of the volume slider itself (sliderW):

var volumeSliderEnd = volumeSliderStart + volBackW - sliderW;

volumeSliderX and volumeSliderY are the slider’s x and y positions on the canvas. The y position is the same as the other elements in the audio player, controlStartY. However, the x position is calculated quite differently. First, we take the value of volumeSliderStart and add the difference between the volume slider background width and the slider width (volBackW – sliderW), multiplied by the volume property of the audioElement, which is a number between 0 and 1. This gives us the position relative to the starting point at which we want to draw the volume slider for any given volume setting:

var volumeSliderX  = volumeSliderStart + (audioElement.volume*
    (volBackW - sliderW));
var volumeSliderY  = controlStartY;

Next, we create the volumeSliderDrag variable, which we will use as a switch to tell us whether the volume slider is being dragged by the user at any given moment:

var volumeSliderDrag = false;

Finally, we create the volumeIncrement variable. This variable tells us how much volume to increase or decrease on the audioElement.volume property based on where the slider is positioned on the volume background. Since the maximum value of the volume is 1, we simply find the total width that the volume slider can move on the x-axis (volBackW - sliderW) and divide 1 by that value. This will give us a product that we can multiply by the x position of the slider, relative to volumeSliderStart, to give us the volume we should set for the audioElement:

var volumeIncrement = 1/(volBackW - sliderW);
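A quick numeric sketch using the widths defined above shows the mapping; the 50% slider position is a hypothetical example:

```javascript
var volBackW = 50;            // volume background width
var sliderW = 10;             // slider width
var volumeSliderStart = 263;  // volBackX = controlStartX + bW + playBackW

// One full unit of volume spread over 40 pixels of slider travel.
var volumeIncrement = 1 / (volBackW - sliderW);

// A slider dragged to the middle of its 40-pixel travel...
var volumeSliderX = volumeSliderStart + 20;

// ...maps back to 50% volume.
var volume = (volumeSliderX - volumeSliderStart) * volumeIncrement;
console.log(volume); // 0.5
```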

Volume slider functionality

Now that we have discussed the variables we need for the volume slider, we will talk about how we use them in the various functions of the audio player. The good news is that the implementation is simple now that you know how the variables work.

In the eventMouseDown() handler, we perform a hit test point-style test, just like we did with the play/pause and loop/no loop buttons, to see whether the volume slider was clicked. If so, we set the volumeSliderDrag variable to true. This means that the volume slider will now move to the x position of the mouse when we call drawScreen():

function eventMouseDown(event) {

   if ( (mouseY >= volumeSliderY) && (mouseY <= volumeSliderY+sliderH) && 
        (mouseX >= volumeSliderX) && (mouseX <= volumeSliderX+sliderW) ) {
      volumeSliderDrag = true;
   }

}

In the eventMouseUp() handler, we test to see whether the volumeSliderDrag is set to true. If so, it means that the user has released the mouse button and no longer wants to drag the volume slider. We set volumeSliderDrag to false so the slider will not move with the mouse:

if (volumeSliderDrag) {
   volumeSliderDrag = false;
}

In drawScreen() we actually put the pixels to the canvas, so to speak, with the volume slider. First, we draw the background image from buttonSheet:

//vol Background
      context.drawImage(buttonSheet, 32,32,volBackW,bH,volBackX,volBackY,volBackW,bH);

Next, we check to see whether volumeSliderDrag is true. If so, we make the volumeSliderX variable equal to the mouse’s x position. Then, we drop in a couple more tests to determine whether the x position of the volume slider falls outside the bounds of the volume background. These two tests make sure that the volume slider does not move past the rightmost or leftmost sides of the volume slider background, and in turn, the volume property of audioElement is not calculated to be more than 1 or less than 0:

if (volumeSliderDrag) {
   volumeSliderX = mouseX;
   if (volumeSliderX > volumeSliderEnd) {
      volumeSliderX = volumeSliderEnd;
   }
   if (volumeSliderX < volumeSliderStart) {
      volumeSliderX = volumeSliderStart;
   }

If volumeSliderDrag is false, we still need an x position at which to draw the slider graphic. We get it the same way we calculated volumeSliderX when we initialized the variable in the canvasApp() function:

} else {
   volumeSliderX = volumeSliderStart + (audioElement.volume*
    (volBackW -sliderW));
}

Finally, we draw the slider onto the canvas:

context.drawImage(buttonSheet, 238,0,sliderW,bH,volumeSliderX,
    volumeSliderY, sliderW,bH);
audioElement.volume = (volumeSliderX-volumeSliderStart) * volumeIncrement;

Figure 7-6 displays the custom controls in the browser.

Canvas sound player with custom controls

Figure 7-6. Canvas sound player with custom controls

So there you have it. You can test the audio player as CH7EX5.html in the source code. The full code listing for the HTML5 Canvas audio player is shown in Example 7-5.

Example 7-5. A custom audio player on the canvas

<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>CH7EX5: A Custom Sound Player On The Canvas</title>
<script src="modernizr-1.6.min.js"></script>
<script type="text/javascript">
window.addEventListener('load', eventWindowLoaded, false);
var loadCount = 0;
var itemsToLoad = 2;
var buttonSheet;
var audioElement;
function eventWindowLoaded() {

   audioElement = document.createElement("audio");
   document.body.appendChild(audioElement);
   var audioType = supportedAudioFormat(audioElement);
   if (audioType == "") {
      alert("no audio support");
      return;
   }
   audioElement.setAttribute("src", "song1." + audioType);
   audioElement.addEventListener("canplaythrough",itemLoaded,false);

   buttonSheet = new Image();
   buttonSheet.onload = itemLoaded;
   buttonSheet.src = "audiocontrols.png";

}

function supportedAudioFormat(audio) {
   var returnExtension = "";
   if (audio.canPlayType("audio/ogg") =="probably" || 
       audio.canPlayType("audio/ogg") == "maybe") {
         returnExtension = "ogg";
   } else if(audio.canPlayType("audio/wav") =="probably" || 
       audio.canPlayType("audio/wav") == "maybe") {
         returnExtension = "wav";
   } else if(audio.canPlayType("audio/mp3") == "probably" || 
       audio.canPlayType("audio/mp3") == "maybe") {
         returnExtension = "mp3";
   }

   return returnExtension;

}

function canvasSupport () {
    return Modernizr.canvas;
}

function itemLoaded(event) {

   loadCount++;
   if (loadCount >= itemsToLoad) {
      canvasApp();

   }

}

function canvasApp() {

  if (!canvasSupport()) {
          return;
        }

  function  drawScreen () {

      //Background

      context.fillStyle = "#ffffaa";
      context.fillRect(0, 0, theCanvas.width, theCanvas.height);

      //Box
      context.strokeStyle = "#000000";
      context.strokeRect(5,  5, theCanvas.width-10, theCanvas.height-10);

      // Text
      context.fillStyle = "#000000";
      context.fillText("Duration: " + audioElement.duration, 20, 20);
      context.fillText("Current time: " + audioElement.currentTime, 250, 20);
      context.fillText("Loop: " + audioElement.loop, 20, 40);
      context.fillText("Autoplay: " + audioElement.autoplay, 250, 40);
      context.fillText("Muted: " + audioElement.muted, 20, 60);
      context.fillText("Controls: " + audioElement.controls, 250, 60);
      context.fillText("Volume: " + audioElement.volume, 20, 80);
      context.fillText("Paused: " + audioElement.paused, 250, 80);
      context.fillText("Ended: " + audioElement.ended, 20, 100);
      context.fillText("Can Play OGG: " + audioElement.canPlayType("audio/ogg"),
                       250, 100);
      context.fillText("Can Play WAV: " + audioElement.canPlayType("audio/wav"),
                       20, 120);
      context.fillText("Can Play MP3: " + audioElement.canPlayType("audio/mpeg"),
                       250, 120);
      context.fillText("Source: " + audioElement.currentSrc, 20, 140);
      context.fillText("volumeSliderDrag: " + volumeSliderDrag, 20, 160);

      //Draw Controls

      //play or pause

      if (audioElement.paused) {
         context.drawImage(buttonSheet, 0,0,bW,bH,playX,playY,bW,bH);//show play

      } else {
         context.drawImage(buttonSheet, 0,32,bW,bH,playX,playY,bW,bH); //show pause

      }

      //loop

      if (audioElement.loop) {
         context.drawImage(buttonSheet, 114,32,bW,bH,loopX,loopY,bW,bH);//show loop
      } else {
         context.drawImage(buttonSheet, 82,32,bW,bH,loopX,loopY,bW,bH); //show no loop
      }

      //play background
      context.drawImage(buttonSheet, 32,0,playBackW,bH,playBackX,playBackY,playBackW,bH);

      //vol Background
      context.drawImage(buttonSheet, 32,32,volBackW,bH,volBackX,volBackY,volBackW,bH);

      //play slider
      var slideIncrement = playBackW / audioElement.duration;
      var sliderX = controlStartX + bW +
          (slideIncrement * audioElement.currentTime);
      context.drawImage(buttonSheet, 238,0,sliderW,bH,sliderX,controlStartY,sliderW,bH);

      //Go back to start
      if (audioElement.ended && !audioElement.loop) {
         audioElement.currentTime = 0;
         audioElement.pause();
      }

      //Volume slider
      //Test Volume Drag

      if (volumeSliderDrag) {
         volumeSliderX = mouseX;
         if (volumeSliderX > volumeSliderEnd) {
            volumeSliderX = volumeSliderEnd;
         }
         if (volumeSliderX < volumeSliderStart) {
            volumeSliderX = volumeSliderStart;
         }
      } else {
         volumeSliderX = volumeSliderStart + (audioElement.volume * (volBackW - sliderW));
      }

      context.drawImage(buttonSheet, 238,0,sliderW,bH,volumeSliderX,volumeSliderY,
          sliderW,bH);
      audioElement.volume = (volumeSliderX-volumeSliderStart) * volumeIncrement;

   }

   function eventMouseDown(event) {

      //Hit Volume Slider
      if ( (mouseY >= volumeSliderY) && (mouseY <= volumeSliderY+sliderH) && 
           (mouseX >= volumeSliderX) && (mouseX <= volumeSliderX+sliderW) ) {
         volumeSliderDrag = true;

      }

   }

   function eventMouseMove(event) {
      if (event.layerX || event.layerX == 0) {
         mouseX = event.layerX;
         mouseY = event.layerY;
      } else if (event.offsetX || event.offsetX == 0) {
         mouseX = event.offsetX;
         mouseY = event.offsetY;
      }

   }

   function eventMouseUp(event) {

      //Hit Play
      if ( (mouseY >= playY) && (mouseY <= playY+bH) && (mouseX >= playX) && 
           (mouseX <= playX+bW) ) {
         if (audioElement.paused) {
            audioElement.play();

         } else {
            audioElement.pause();

         }

      }

      //Hit loop
      if ( (mouseY >= loopY) && (mouseY <= loopY+bH) && (mouseX >= loopX) && 
           (mouseX <= loopX+bW) ) {
         audioElement.loop = !audioElement.loop;

      }

      volumeSliderDrag = false;

   }

   var theCanvas = document.getElementById("canvasOne");
   var context = theCanvas.getContext("2d");

   var bW = 32;
   var bH = 32;
   var playBackW = 206;
   var volBackW = 50;
   var sliderW = 10;
   var sliderH = 32;
   var controlStartX = 25;
   var controlStartY = 200;
   var playX = controlStartX;
   var playY = controlStartY;
   var playBackX = controlStartX+bW;
   var playBackY = controlStartY;
   var volBackX = controlStartX+bW+playBackW;
   var volBackY = controlStartY;
   var loopX = controlStartX+bW+playBackW+volBackW;
   var loopY = controlStartY;
   var mouseX;
   var mouseY;

   theCanvas.addEventListener("mouseup",eventMouseUp, false);
   theCanvas.addEventListener("mousedown",eventMouseDown, false);
   theCanvas.addEventListener("mousemove",eventMouseMove, false);

   audioElement.play();
   audioElement.loop = false;
   audioElement.volume = 0.5;
   var volumeSliderStart = volBackX;
   var volumeSliderEnd = volumeSliderStart + volBackW - sliderW;
   var volumeSliderX = volumeSliderStart + (audioElement.volume * (volBackW - sliderW));
   var volumeSliderY = controlStartY;
   var volumeSliderDrag = false;
   var volumeIncrement = 1 / (volBackW - sliderW);

   setInterval(drawScreen, 33);

}

</script>

</head>
<body>
<div style="position: absolute; top: 50px; left: 50px;">

<canvas id="canvasOne" width="500" height="300">
 Your browser does not support HTML5 Canvas.
</canvas>
</div>
</body>
</html>
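The volume control in the listing converts between a pixel position and the audio element's 0-to-1 volume via volumeIncrement. Here is a standalone sketch of just that mapping, runnable outside the browser; the helper name sliderToVolume is illustrative and not part of the listing, while the constants mirror the listing's values (controlStartX = 25, bW = 32, playBackW = 206, volBackW = 50, sliderW = 10, so the slider travels from x = 263 to x = 303):

```javascript
// Illustrative helper (not in the book's listing): converts the volume
// slider's x position into an HTML5 audio volume between 0 and 1.
function sliderToVolume(sliderX, sliderStart, volBackW, sliderW) {
   var volumeIncrement = 1 / (volBackW - sliderW); // volume per pixel of travel
   var volume = (sliderX - sliderStart) * volumeIncrement;
   return Math.min(1, Math.max(0, volume));        // clamp, as the drag code does
}

// With the listing's values the slider travels 40 pixels (volBackW - sliderW):
console.log(sliderToVolume(263, 263, 50, 10)); // left edge  -> 0
console.log(sliderToVolume(303, 263, 50, 10)); // right edge -> 1
console.log(sliderToVolume(283, 263, 50, 10)); // midpoint   -> 0.5
```

The clamp matters because mousemove keeps reporting positions after the pointer leaves the slider track; without it, a drag past either end would push the computed volume outside the 0-to-1 range that the volume property accepts.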
