
Blend the Intel® RealSense™ Camera and the Intel® Edison Board with JavaScript

22 Dec 2015 · CPOL · 10 min read
Using Intel® RealSense™ technology, this project modifies the JavaScript framework sample script to send captured gesture data to the Node.js server.

This article is in the Product Showcase section for our sponsors at CodeProject. These articles are intended to provide you with information on products and services that we consider useful and of value to developers.

Introduction

Smart devices can now connect to things we never before thought possible. This is being enabled by the Internet of Things (IoT), which allows these devices to collect and exchange data.

Intel has created Intel® RealSense™ technology, which includes the Intel® RealSense™ camera and the Intel® RealSense™ SDK. Using this technology, you can create applications that detect gestures and head movements, analyze facial data, perform background segmentation, read depth levels, recognize and synthesize voice, and more. Imagine that you are developing a super sensor that can detect many things. Combined with the versatile uses of the Intel® Edison kit and its output, you can build creative projects that are both useful and entertaining.

The Intel® RealSense™ SDK provides support for popular programming languages and frameworks such as C++, C#, Java*, JavaScript*, Processing, and Unity*. This means that developers can get started quickly in a programming environment they are already familiar with.

Peter Ma’s article, Using an Intel® RealSense™ 3D Camera with the Intel® Edison Development Platform, presents two examples of applications using C#. The first uses the Intel® RealSense™ camera as input and the Intel® Edison board as output. The result is that if you spread your fingers in front of the Intel® RealSense™ camera, it sends a signal to the Intel® Edison board to turn on a light.

In the second example, Ma reverses the flow, with the Intel® Edison board as input and the Intel® RealSense™ camera as output. The Intel® Edison board provides sensor data, which is processed and then presented to us through voice synthesis on the Intel® RealSense™ side, giving the data a more human feel.

Ma’s project inspired me to build something similar, but using JavaScript* instead of C#. I used the Intel® RealSense™ SDK to read and send hand gesture data to a node.js server, which then sends the data to the Intel® Edison board to trigger a buzzer and LED that are connected to it.

About the Project

This project is written in JavaScript*. If you only want to implement basic gesture recognition, the algorithm module included in the Intel® RealSense™ SDK gives you everything you need.

Hardware

Requirements:

Intel® Edison board with the Arduino breakout board

The Intel® Edison board is a low-cost, general-purpose computer platform. It uses a 22nm dual-core Intel® Atom™ SoC running at 500 MHz. It supports 40 GPIOs and includes 1 GB LPDDR3 RAM, 4 GB eMMC storage, dual-band Wi-Fi, and Bluetooth, all in a small form factor.

The board runs the Linux* kernel and is compatible with Arduino, so it can run an Arduino implementation as a Linux* program.


Figure 1. Intel® Edison breakout board kit.

Grove Starter Kit Plus - Intel® XDK IoT Edition

Grove Starter Kit Plus - Intel® XDK IoT Edition is designed for the Intel® Galileo board Gen 2, but it is fully compatible with the Intel® Edison board via the breakout board kit.

The kit contains sensors, actuators, and shields, such as a touch sensor, light sensor, and sound sensor, and also contains an LCD display as shown in Figure 2. This kit is an affordable solution for developing an IoT project.

You can purchase the Grove Starter Kit Plus here:


Figure 2. Grove* Starter Kit Plus - Intel® XDK IoT Edition

Intel® RealSense™ Camera

The Intel® RealSense™ camera is built for game interaction, entertainment, photography, and content creation, and comes in either a system-integrated or a peripheral version. The camera’s minimum requirements are a USB 3.0 port, a 4th generation Intel® Core™ processor, and 8 GB of hard drive space.

The camera (shown in Figure 3) features full 1080p color and a depth sensor, giving the PC an immersive 3D visual experience.


Figure 3. Intel® RealSense™ camera

You can purchase the complete developer kit, which includes the camera, here.

GNU/Linux* server

A GNU/Linux* server is easy to set up. You can use an old computer or laptop, or you can put the server in the cloud. I used a cloud server running Ubuntu* Server. If you prefer a different Linux* flavor for the server, just adapt the commands to your distribution.

Software

Before we start to develop the project, make sure you have the following software installed on your system. You can use the links to download the software.

Set Up the Intel® RealSense™ Camera

To set up the Intel® RealSense™ camera, connect the Intel® RealSense™ camera (F200) to a USB 3.0 port, and then install the driver once the camera is connected to your computer. Navigate to the Intel® RealSense™ SDK install location, and open the JavaScript* sample in your browser:

Install_Location\RSSDK\framework\JavaScript\FF_HandsViewer\FF_HandsViewer.html

Image 4

After the file opens, the script checks which platform you have. While the script is checking your platform, click the link in your web browser to install the Intel® RealSense™ SDK WebApp Runtime.

When the installation is finished, restart your web browser, and then open the file again. You can check that the installation was a success by raising your hand in front of the camera: your hand gesture data should be visualized in your web browser.

Gesture Set Up

Image 5

The first key piece of gesture data fired by the camera looks like the following:

JavaScript
{
  "timeStamp": 130840014702794340,
  "handId": 4,
  "state": 0,
  "frameNumber": 1986,
  "name": "spreadfingers"
}

This sends "name":"spreadfingers" to the server to be processed.
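For illustration only, here is a minimal sketch of how such a payload could be filtered on the client before forwarding. The helper name below is my own and is not part of the Intel® RealSense™ SDK.

JavaScript
// Hypothetical helper: returns true when a fired-gesture payload
// carries the gesture name this project cares about.
function isSpreadFingers(gestureData) {
  return !!gestureData && gestureData.name === 'spreadfingers';
}

// Example usage with a payload like the one shown above:
// if (isSpreadFingers(payload)) { /* forward it to the server */ }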

Next, we will write some JavaScript* code to stream gesture data from the Intel® RealSense™ camera to the Intel® Edison board through the node.js server.

Working with JavaScript*

Finally, we get to do some programming. I suggest that you first copy the whole folder, because the default installation doesn’t allow the original folder to be modified in place.

Copy the FF_HandsViewer folder from this location and paste it somewhere else. The folder’s location is:

\install_Location\RSSDK\framework\JavaScript\FF_HandsViewer\

Eventually, you will want to create your own project folder to keep things organized.

Next, copy the realsense.js file from the location below and paste it inside the FF_HandsViewer folder:

Install_Location\RSSDK\framework\common\JavaScript

To make everything easier, let’s create one file named edisonconnect.js. This file will receive gesture data from the Intel® RealSense™ camera and send it to the node.js server. Remember to change the IP address in the socket variable so that it points to your node.js server:

JavaScript
// Change this to the IP address of your node.js server.
var socket = io('http://192.168.1.9:1337');

// Called from sample.js with each fired gesture; forwards it to the server.
function edisonconnect(data){
  console.log(data.name);
  socket.emit('realsense_signal', data);
}
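The camera can fire the same gesture many times in a row, which means the same name is sent to the server repeatedly. As an optional refinement (a sketch of my own, not part of the original sample), edisonconnect.js could forward only changes in the gesture name:

JavaScript
// Optional variant of edisonconnect.js: emit only when the gesture name changes.
// Change the IP address to your node.js server, as before.
var socket = io('http://192.168.1.9:1337');
var lastGesture = null;

function edisonconnect(data){
  if (data.name !== lastGesture) {
    lastGesture = data.name;
    console.log(data.name);
    socket.emit('realsense_signal', data);
  }
}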

Now for the most important step: hooking into sample.js, where the gesture data is produced, so that each fired gesture is intercepted and passed to edisonconnect.js. You don’t need to worry about CPU activity; this adds negligible frame-rate and RAM overhead.

JavaScript
// retrieve the fired gestures
for (g = 0; g < data.firedGestureData.length; g++){
  $('#gestures_status').text('Gesture: ' + JSON.stringify(data.firedGestureData[g]));

  // add script start - passing gesture data to edisonconnect.js
  edisonconnect(data.firedGestureData[g]);
  // add script end
}

Once the loop above is running and producing gesture data, the HTML below completes the main task on the JavaScript* side. Note that you also have to update the realsense.js file path to point to the copy you made earlier.

It is critical to link both the socket.io client and the edisonconnect.js file:

HTML
<!DOCTYPE html>
<html>
<head>
	<title> Intel&reg; RealSense&trade; SDK JavaScript* Sample </title>

	<script src="https://aubahn.s3.amazonaws.com/autobahnjs/latest/autobahn.min.jgz"></script>
	<script src="https://promisejs.org/polyfills/promise-6.1.0.js"></script>
	<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
	<script src="realsense.js"></script>
	<script src="sample.js"></script>
	<script src="three.js"></script>

	<!-- add script start -->
	<script src="https://cdn.socket.io/socket.io-1.3.5.js"></script>
	<script src="edisonconnect.js"></script>
	<!-- add script end -->

	<link rel="stylesheet" type="text/css" href="style.css">
</head>
<body>

The code is taken from the SDK sample and has been trimmed to keep it simple and easy to follow. Its job at this point is to send gesture data to the server: the Intel® RealSense™ SDK has successfully recognized the gesture and is ready to send it on.

Set Up the Server

We will use a GNU/Linux*-based server. I use Ubuntu* Server as the OS, but you can use any GNU/Linux* distribution that you are familiar with. We will skip the server installation itself, because related tutorials are readily found on the Internet.

Log in as a root user through SSH to configure the server.

Because the server has just been installed, we need to update the repository list and upgrade the system. The commands below are the usual ones for an Ubuntu* distribution; use the equivalent commands for the GNU/Linux* distribution that you are using.

# apt-get update && apt-get upgrade

Once the repository list is updated, the next step is to install node.js.

# apt-get install nodejs

We also need to install the npm package manager.

# apt-get install npm

Finally, install socket.io and express using npm.

# npm install socket.io express

Remember to create the server.js and index.html files.

# touch server.js index.html

Edit the server.js file using your favorite text editor, such as vim or nano:

# vim server.js

Write this code:

JavaScript
var express = require("express");
var app     = express();
var port    = 1337;

// Serve static files (such as index.html) from the current directory.
app.use(express.static(__dirname + '/'));
var io = require('socket.io').listen(app.listen(port));
console.log("Listening on port " + port);

io.on('connection', function(socket){
  'use strict';
  console.log('a user connected from ' + socket.request.connection.remoteAddress);

  // Rebroadcast RealSense gesture data to all other connected clients.
  socket.on('realsense_signal', function(data){
    socket.broadcast.emit('realsense_signal', data);
    console.log('Hand Signal: ' + data.name);
  });

  socket.on('disconnect', function(){
    console.log('user disconnected');
  });
});

var port = 1337; assigns the server to port 1337, and console.log("Listening on port " + port); confirms that the server is up and listening. The key line is socket.broadcast.emit('realsense_signal', data); — when gesture data arrives from the JavaScript* client, it is rebroadcast to all other listening clients.

The last thing we need to do is run the server.js file with node. If "Listening on port 1337" is displayed as shown below, you have been successful.

# node server.js

root@edison:~# node server.js
Listening on port 1337
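Before connecting the camera or the board, you can verify the server end to end with a small test client run from any machine that has Node.js and the socket.io-client package (npm install socket.io-client). This is a sketch of my own, not part of the article's sample, and the file name test-client.js is just a suggestion.

JavaScript
// test-client.js — connects to the server and emits a fake gesture so you can
// watch the server log ("Hand Signal: spreadfingers") without the camera attached.
// Other connected clients receive the rebroadcast; the sender itself does not.
var socket = require('socket.io-client')('http://192.168.1.9:1337');

socket.on('connect', function(){
  console.log('connected to server');
  socket.emit('realsense_signal', { name: 'spreadfingers', handId: 0 });
});

socket.on('realsense_signal', function(data){
  console.log('broadcast received: ' + data.name);
});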

Set up the Intel® Edison Board

The Intel® Edison SDK is easy to deploy. Refer to the following documentation:

Now it's time to put the code onto the Intel® Edison board. This code connects to the server and listens for any broadcast coming from it, much like the listening code on the server side. When gesture data is received, the Intel® Edison board switches its digital pins on or off.

Open the Intel® XDK IoT Edition and create a new project from Templates, using the DigitalWrite template, as shown in the screenshot below.

Image 6

Edit line 9 in package.json by adding a dependency on socket.io-client. If the dependencies section is empty, add it as shown below. This dependency installs the socket.io client on the Intel® Edison board when the project is deployed, if it is not already present.

JavaScript
"dependencies": {
  "socket.io-client":"latest" // add this script
}

Open the file named main.js. The code first connects to the server to make sure the server is ready and listening. It then checks whether the incoming gesture data name is "spreadfingers"; if so, digital pins 2 and 8 are set to 1 (on), and otherwise they are set back to 0 (off). Change the server IP address to match yours, and if you want to use different pins, make sure you also change them in mraa.Gpio().

JavaScript
var mraa = require("mraa");

// Digital pins 2 and 8 drive the LED and buzzer (change these if you wire different pins).
var pins2 = new mraa.Gpio(2);
pins2.dir(mraa.DIR_OUT);

var pins8 = new mraa.Gpio(8);
pins8.dir(mraa.DIR_OUT);

// Change this to the IP address of your node.js server.
var socket = require('socket.io-client')('http://192.168.1.9:1337');

socket.on('connect', function(){
  console.log('i am connected');
});

// Turn the pins on when the "spreadfingers" gesture is broadcast, off otherwise.
socket.on('realsense_signal', function(data){
  console.log('Hand Signal: ' + data.name);
  if(data.name == 'spreadfingers'){
    pins2.write(1);
    pins8.write(1);
  } else {
    pins2.write(0);
    pins8.write(0);
  }
});

socket.on('disconnect', function(){
  console.log('i am not connected');
});
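If you would rather have the buzzer beep briefly than stay on while the gesture is held, one option (a sketch under the same mraa assumptions as above; the 500 ms duration is arbitrary) is to pulse its pin with a timer:

JavaScript
// Optional: pulse a pin for a short time instead of holding it high.
// Assumes the same mraa pin objects (pins2, pins8) created in main.js above.
function pulsePin(pin, durationMs) {
  pin.write(1);
  setTimeout(function(){
    pin.write(0);
  }, durationMs);
}

// Example: inside the 'realsense_signal' handler, for the buzzer pin
// if (data.name == 'spreadfingers') { pins2.write(1); pulsePin(pins8, 500); }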

Select Install/Build, and then select Run after making sure the Intel® Edison board is connected to your computer.

Image 7

Now make sure the server is up and running, and that both the computer with the Intel® RealSense™ camera and the Intel® Edison board are connected to the network.

Conclusion

Using Intel® RealSense™ technology, this project modified the JavaScript* framework sample script to send captured gesture data to the Node.js server. But this project is only a beginning for more to come.

The code is straightforward. The server broadcasts the gesture data to any socket client that is listening. The Intel® Edison board, with socket.io-client installed, listens for the broadcast from the server. As a result, gesture data named spreadfingers toggles the digital pins between 1 and 0.

The possibilities are endless. The Intel® RealSense™ camera is lightweight and easy to carry and use, and the Intel® Edison board is a powerful embedded PC. By blending and connecting the Intel® Edison board and the Intel® RealSense™ camera with JavaScript*, it is easy to pack, code, and build an IoT device. You can create something great and useful.

About the Author

Aulia Faqih - Intel® Software Innovator

Intel® RealSense™ Technology Innovator based in Yogyakarta, Indonesia, currently lecturing at UIN Sunan Kalijaga Yogyakarta. He loves playing with Galileo and Edison boards, the web, and all things geek.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

