first draft
tonylukasavage committed Nov 14, 2011
1 parent 19c4992 commit 2977357
Showing 1 changed file with 1 addition and 1 deletion: ep-015/vo.md
Hello and welcome to another episode of Forging Titanium. Today is part 2 in the 3-part Twisti app series. In part 1 we saw how we can create a native Android module to leverage additional sensor data, and native processing performance, to determine the physical orientation of an Android device. If you missed part 1, I highly suggest checking it out before digging into this episode; we'll be picking up right where we left off last time. In _this_ screencast, we're going to use that sensor data to render a 3-dimensional representation of our device. To do so, we'll use the open source 3D JavaScript engine, Three.js, along with the `<canvas>` element in a Titanium WebView. It's the best of both worlds: powerful platform-specific functionality and native performance combined with the open compatibility of web-based rendering.

In case you missed the sneak peek in last week's episode, let's take a quick look at what Twisti looked like last week, and what we are going to do with it today...

### LIVE VIDEO ###

So while this stuff looks really cool, I'm betting at least a few people watching are a little intimidated by the idea of doing their own 3D rendering and animation. But don't worry, we have Three.js to make the topic _much_ more accessible. In its author's own words...

> The aim of the project is to create a lightweight 3D engine with a very low level of complexity

This library abstracts away all the gritty details of 3D rendering and leaves us with a simple JavaScript API to manage, something you Titanium developers should be very familiar with by now. Let's go out to GitHub and check out the Three.js repository. In here we'll find everything we need to get started with Three.js, including the library itself, documentation, and some pretty cool examples. The bonus here is that this technology is totally web-based, which means the 3D rendering we'll show off in our mobile app will work just as well on a website, as we'll soon see in the examples below.

While it's worthwhile to download the entire repository, we only need 2 files for Twisti. If we go to the `build` directory, we'll find the core Three.js file already minified. We'll need to include this in our Titanium project. For maximum compatibility we'll need one more file: in `examples/js` we'll find RequestAnimationFrame.js. This file lets us specify which function will control our render loop. When possible, Three.js will use your browser's native functionality. If no such function is available, as is the case in many mobile browsers, RequestAnimationFrame.js provides a simple timeout callback structure for running your render loop. We'll download and include this file in our project as well.
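To make that concrete, here's a minimal sketch of the kind of shim a file like RequestAnimationFrame.js provides. The vendor-prefixed fallbacks and the roughly-60-frames-per-second timeout are typical of shims from that era, not quoted from the file itself:

```javascript
// Sketch of a requestAnimationFrame shim: prefer the browser's native
// implementation, then vendor-prefixed versions, and finally fall back to a
// plain setTimeout running at roughly 60 frames per second.
window.requestAnimationFrame = window.requestAnimationFrame ||
    window.webkitRequestAnimationFrame ||
    window.mozRequestAnimationFrame ||
    function (callback) {
        window.setTimeout(callback, 1000 / 60);
    };

// The render loop then simply re-schedules itself each frame.
function animate() {
    requestAnimationFrame(animate);
    renderer.render(scene, camera); // assumes a scene, camera, and renderer already exist
}
```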
Now that we know what we need to use Three.js, let's take a look at some of the `<canvas>` examples, as that's the rendering context we'll be using. Clicking through here we see all kinds of really cool, interactive examples. We can render particles... interaction... 3D environments... and a whole bunch of other neat stuff...

What we're going to do is create a simple 6-sided mesh to represent a mobile device. With Three.js, we'll use basic texturing to make it _look_ like a phone, render the mesh to the `<canvas>`, and then animate it using the device's sensor data. To do so, we are going to make only slight modifications to the basic "Hello, World!" example found here. I won't give a course on 3D programming here for two reasons: first, it's well beyond the scope of this screencast, and second, any course I would give would likely be widely discredited, as I'm no 3D pro. But that's just a testament to how easy it can be to incorporate really stunning visual effects and interaction in your Titanium apps with this library.

The most basic setup, shown here, includes creating a scene for your rendering, creating and positioning a camera that represents your user's viewport, putting objects, or meshes, into the scene, then initiating the render loop, which renders your scene to the `<canvas>`. The render loop is where you'll typically make changes to your scene, which appear as animations.

OK, let's fire up Titanium Studio and see how I built this functionality into a Titanium mobile app. If you're familiar with part 1 of the Twisti series, the app.js will look _very_ familiar. We needed to make only a few critical changes to add Three.js rendering to our app.

The first thing you'll note is the `needsCalibration` variable. As mentioned in the demo earlier, calibrating Twisti creates an offset for the azimuth so that your mobile device appears to be facing the correct direction in the app. This variable tracks whether the calibrate button has been clicked. We'll see exactly how it's used in just a moment.

The rest of the UI construction has only minor changes. We of course add the webview that will hold the `<canvas>` element, defined by the `web/index.html` file. In here we see a very simple HTML document, doing little more than including the Three.js files and our own logic in the `twistiScene.js` file. The canvas created by the `Twisti.createScene()` call will be doing all the heavy lifting, and we'll discuss it more in a little bit. A simple toolbar is placed underneath the webview to house our calibration button. Under that we display the various sensor data readings, just as we did in part 1.

In the Twisti event listener, we update all of the sensor reading labels, as we did last time. In addition, we also fire an application-level event that contains the current azimuth, pitch, and roll of our device. The event also indicates whether or not the azimuth needs to be recalibrated. We use an application-level event here because that's the only way for native Titanium code to communicate with a webview's own JavaScript code. If we jump ahead quickly to the JavaScript in the `twistiScene.js` file, we see that the `app:updateRotation` event can be listened for, and the data from our native Titanium code can be used to modify the settings of the Three.js mesh. This is a great example of Titanium letting us cross the boundary between native code and cross-platform, web-based code.
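To make that bridge concrete, here's a hedged sketch of the two sides. Only the `app:updateRotation` event name, the azimuth/pitch/roll payload, and the recalibration flag come from the walkthrough; the `twisti` module handle, its `update` event, and the exact property names are placeholders for illustration.

```javascript
// app.js (native side), sketch only: when the Twisti module reports new sensor
// readings, forward them to the webview with an application-level event.
// "twisti" and its "update" event are hypothetical names for this example.
twisti.addEventListener('update', function (e) {
    Ti.App.fireEvent('app:updateRotation', {
        azimuth: e.azimuth,
        pitch: e.pitch,
        roll: e.roll,
        calibrate: needsCalibration // ask the scene to re-zero the azimuth
    });
    needsCalibration = false; // assumed: reset once the offset has been applied
});
```

On the webview side, the same event is picked up in `twistiScene.js`:

```javascript
// twistiScene.js (inside the webview), sketch only: application-level events
// are how the webview's own JavaScript hears back from native Titanium code.
Ti.App.addEventListener('app:updateRotation', function (e) {
    // use e.azimuth, e.pitch, and e.roll to rotate the Three.js mesh
});
```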
Going back to the top of the `twistiScene.js` file, we see code very similar to the "Hello, World!" example from the Three.js site. The first difference is that we create an array of 6 materials to apply to the mesh we will create. Each material is used by a specific face of our soon-to-be 6-sided mesh. As you can see, it's easy to specify images or colors for your materials. There are also tons of other fun materials that we won't get into here, like shaded or reflective surfaces.

Let's scroll down to where we create our mesh, which in this case represents a phone. The `THREE.CubeGeometry` function defines the shape of our mesh. The first 3 arguments are the width, height, and depth, respectively. The next 3 arguments are the number of segments you would like the width, height, and depth to have. You can leave these at 1 for now, as we won't be changing the shape of the mesh at all. The final argument is the materials array, which applies the materials we created to the phone mesh.

Finally, as we saw before, we have our application-level event listener. Here Three.js receives updates from our Titanium code and animates our phone mesh. We adjust the azimuth, if necessary, then rotate the mesh around its axes based on the sensor data. This is what makes the mesh move as we move the device. We finish up by initiating the render loop, then exposing the `createScene` method via the `Twisti` namespace. This is the function we call in `index.html` to create the canvas and render our scene...
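Pulling those pieces together, here's a loose sketch of what a `twistiScene.js` along these lines might look like. It follows the structure just described, but it is not the app's actual code: constructor names and signatures varied between early Three.js releases (for example, `THREE.PerspectiveCamera` started life as plain `THREE.Camera`), and the dimensions, image paths, container argument, and rotation math are all placeholders.

```javascript
// twistiScene.js, sketch only: a 6-sided "phone" mesh driven by sensor data.
var Twisti = (function () {
    var scene, camera, renderer, phone;
    var azimuthOffset = 0;

    // One material per face of the cube; the images and colors are placeholders.
    var materials = [
        new THREE.MeshBasicMaterial({ map: THREE.ImageUtils.loadTexture('images/side.png') }),  // right
        new THREE.MeshBasicMaterial({ map: THREE.ImageUtils.loadTexture('images/side.png') }),  // left
        new THREE.MeshBasicMaterial({ color: 0x111111 }),                                       // top
        new THREE.MeshBasicMaterial({ color: 0x111111 }),                                       // bottom
        new THREE.MeshBasicMaterial({ map: THREE.ImageUtils.loadTexture('images/front.png') }), // front
        new THREE.MeshBasicMaterial({ map: THREE.ImageUtils.loadTexture('images/back.png') })   // back
    ];

    function createScene(container) {
        scene = new THREE.Scene();

        // The camera represents the user's viewport into the scene.
        camera = new THREE.PerspectiveCamera(70, container.offsetWidth / container.offsetHeight, 1, 1000);
        camera.position.z = 400;
        scene.add(camera);

        // Width, height, and depth, then one segment per dimension, then the materials array.
        var geometry = new THREE.CubeGeometry(120, 240, 15, 1, 1, 1, materials);
        phone = new THREE.Mesh(geometry, new THREE.MeshFaceMaterial());
        scene.add(phone);

        // Render into a <canvas> appended to the supplied container element.
        renderer = new THREE.CanvasRenderer();
        renderer.setSize(container.offsetWidth, container.offsetHeight);
        container.appendChild(renderer.domElement);

        // Sensor updates arrive from native code via the application-level event.
        Ti.App.addEventListener('app:updateRotation', function (e) {
            if (e.calibrate) {
                azimuthOffset = e.azimuth; // re-zero so the phone faces "forward"
            }
            var deg2rad = Math.PI / 180;
            phone.rotation.y = (e.azimuth - azimuthOffset) * deg2rad;
            phone.rotation.x = e.pitch * deg2rad;
            phone.rotation.z = e.roll * deg2rad;
        });

        animate();
    }

    // Render loop: re-schedule each frame and draw the current state of the scene.
    function animate() {
        requestAnimationFrame(animate);
        renderer.render(scene, camera);
    }

    // Expose createScene on the Twisti namespace so index.html can call it.
    return { createScene: createScene };
})();
```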
In this episode of Forging Titanium, we used the sensor data we gathered in part 1 of the Twisti app series to create a 3-dimensional model of our mobile device. We used Three.js and Titanium's webview component to render an animated 3D scene in an HTML5 `<canvas>` element. With a modest amount of code, we were able to leverage cross-platform, web-based rendering while still making use of Titanium's native features.

In the third and final part of this series, we'll see how we can use an Android device as a sort of sensor proxy. We'll send data in real time, via Titanium sockets, to multiple other devices. Those devices will then be able to render the 3D model of the remote sensor, just as we did locally here. I hope you'll all be back for part 3. See ya then.
