Tuesday, December 27, 2011

Looking back on 2011

Dear readers,

So the year 2011 is coming to an end, and I have ultimately failed my secret goal of writing more articles than last year. The main reason is that I did not post much in the last three months, and the explanation is a single word: Tamias. Tamias is a pet name, the genus of a kind of chipmunk (although some folks complained that our logo looks like a mutated rat or something, hi Ed!). It could have been the name of a pet project, but it really is the pet name of a project we have here at the office.

I can't really explain it all in one post, so let me just state a few basic things to keep it short and interesting.


  1. Tamias is based on Tahoe-LAFS, which makes it awesome from the start
  2. Tamias allows users sharing a mutual interest to gather together and create their own cloudish storage system by providing storage space to the distributed storage "cloud". Sorry for the buzzword; you can really replace cloud with network, p2p network, overlay network, whatever.
  3. Users on the same network exchange identity information with each other (essentially public keys) and can share files privately and securely. Privately means that someone who learns the access information (what is called a capability here; think of it as a self-certifying URL) from someone else cannot use it. Securely means that file blocks are stored encrypted and are thus not readable by the storage node (this is guaranteed by Tahoe-LAFS).
  4. The first beta version of Tamias was released on December 25th here.
This version works, but it is not very user-friendly, so we only expect the most computer-savvy users to try it out and report errors, problems, bugs and so on. In the future we will of course provide a (much, much, much) better UI, and new features like user self-introduction, user search, sharing delegation, network federation, etc.

The Tamias website is at https://tamias.iijlab.net and has a few tutorials about compiling the software and setting up a node. As it is based on Tahoe-LAFS, it includes Tahoe-LAFS and builds upon it. However, you cannot use your existing Tahoe-LAFS grid without modifications, because we added a couple of features on the storage server side that are needed for the privacy extensions.

Short post for a long story, but here's a scoop for you all: the next release will be much better and is planned for next spring...

Happy year-end holidays to you all!
Jean

Wednesday, September 21, 2011

Realtime visualization for Xbee+Accelerometer data

Here is the long-awaited follow-up to the Xbee experiments that my little brother outsourced to me.

Before I get to the description of the software itself, if you want to try this out, be sure to follow the guidelines in the previous post on the same topic. You are then ready to proceed.

And here is what you get with this software:
  • Realtime visualization of all three axes
  • Can be adapted to many different accelerometers by tuning some global variables
  • Recording of samples to file, with timestamps and X, Y and Z values
  • Automatically generated plots for each axis and a combined 3-axis plot
  • Exported files named according to experiment date and time (for easier sorting)
  • All the above, in pure open-source Python goodness
  • English and French localization
  • Theoretically works from within a LiveCD

Now let me briefly describe what the software does. As you can see in the screenshot on the side, the user interface is quite easy to understand. The three rectangle-shaped black boxes show, in realtime, whatever comes from the Xbee chip hooked up to the USB port.

There are also three clickable buttons. The leftmost button, labelled 'start', triggers the beginning of the recording. You will usually push this just before your experiment. The middle button, named 'stop', obviously halts the recording. It also takes care of dumping all the samples into a CSV file and generating the four plots: a combined plot with all 3 axes and three separate plots, one per axis. Of course, the last button closes the USB port and quits the program. Beware that it does not check whether you have a running experiment, and you will hence lose some data unless you push the stop button first.
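For the curious, here is a rough Python sketch of what the 'start'/'stop' cycle boils down to: read samples from the serial port, timestamp them, dump them into a time-stamped CSV file, and generate the plots. This is not the actual xviz.py code; the port name and the parse_sample() helper are placeholders you would adapt to your own Xbee setup.

import csv
import time
from datetime import datetime

import serial                   # pyserial
import matplotlib.pyplot as plt

def parse_sample(raw):
    """Placeholder: decode one raw frame from the Xbee into (x, y, z) readings."""
    x, y, z = (int(v) for v in raw.split())
    return x, y, z

port = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)   # adjust to your adapter
samples = []                                             # (timestamp, x, y, z)

start = time.time()
while time.time() - start < 10:          # the 'start'..'stop' window, here 10 s
    raw = port.readline().strip()
    if raw:
        samples.append((time.time(),) + parse_sample(raw))
port.close()

# Export files are named after the experiment date and time, as in the feature list
stamp = datetime.now().strftime('%Y%m%d-%H%M%S')
with open('xviz-%s.csv' % stamp, 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['timestamp', 'x', 'y', 'z'])
    writer.writerows(samples)

# One plot per axis, plus a combined 3-axis plot
t = [s[0] - samples[0][0] for s in samples]
for i, axis in enumerate('xyz', start=1):
    plt.figure()
    plt.plot(t, [s[i] for s in samples])
    plt.xlabel('time (s)')
    plt.ylabel(axis)
    plt.savefig('xviz-%s-%s.png' % (stamp, axis))

plt.figure()
for i, axis in enumerate('xyz', start=1):
    plt.plot(t, [s[i] for s in samples], label=axis)
plt.legend()
plt.xlabel('time (s)')
plt.savefig('xviz-%s-all.png' % stamp)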

Enough talking now. Here is the Python archive; all you have to do is extract the contents somewhere and run the software by launching the xviz.py script file.

I also uploaded this to the github social coding site, so feel free to fork it, submit patches or whatever!

https://github.com/jean-/xviz

Thursday, September 15, 2011

Wanted: Volunteers to give a hand during the next holiday

Dear readers,

Here is a quick call for volunteering.

An association I volunteer with is organizing a trip to the Sendai area to deliver home appliances and furniture we have been gathering from people who left Japan.

As we do not have enough hands for all the work, we are actively recruiting people who would like to help. Here is the full text:

Silver Week is coming, the "friends" are hiring
----------------------------------------------
by Les Amis du Tohoku on Monday, September 12, 2011 at 11:20pm

EDIT: the date has changed from the 23rd to the 22nd in order to avoid congestion on the Tohoku expressway on a national holiday, which is moreover reduced to a single lane because of (long overdue) construction work.

Dear friends, through our Facebook page you have been able to follow us and our activities, picking up furniture and home appliances around Tokyo. You have been hearing our calls for donations, but you were not moving out and had neither furniture nor home appliances to donate.

Now is your chance to make a difference, to show that you really love Japan, and that you are a friend of the disaster-hit Tohoku area.

Come join us, come help us with the preparations for the next journey, or even the journey itself, on September 22nd, 2011.

We are looking for the following kinds of volunteers:
- On the day of departure, people to help clean the goods
- On the day of departure, people to help load the truck 
- From the day of departure to the following evening, people to drive the truck (standard driving license required*)
- From the day of departure to the following evening, people to join us with their own car and help unload.

Don't hesitate to contact us if you would like to find out more about this trip, to volunteer, or for any other reason. By mail: jean.lorchat@gmail.com

[*]: for any inquiries about driving licenses, please contact us as well

----------------------------------------------


And here is the original post: Les Amis du Tohoku: Silver Week is coming

Friday, August 12, 2011

Adding Pachube readings to any PHP website

Whew, finally got to clean up almost all the frenchisms from the code...

As promised earlier, here is a PHP code snippet to fetch data from the Pachube web service using their API. If this is the first time you are using a web service, there is something you need to know first: web services don't run for free. So even if you are paying for the service, it does not mean you can abuse it by querying for information every second.

As with any API-backed service, Pachube will require you to sign up in order to get an API key that can be used in your scripts. This key identifies you and must thus not be handed around. For those who do not have an account, obviously go ahead and create one. You might want to start with a 'Basic' plan, which is free, until you find that you are reaching the limits of the system and want to switch to something beefier.

When you're done registering, you need to create an API key for use by the PHP script. One way to go is to use the master API key in your script, but this is not such a great idea. Instead, create a "secure sharing key" so that if someone gets to see your code, they won't be able to do much to your existing feeds and so on. You can find these options in the right-hand side menu of the Pachube website, after you have logged in. It is called "My API keys" and resides within the "My account" section. Now that you have created a key with at least the 'GET' permission, write it down; we'll use it later.


The other thing to take into account is that you only have a limited number of queries available over time. This number depends on the web service provider, the type of account (free vs. premium, for example) and other parameters. In the case of the Pachube Basic account, the limit is 5 requests per minute. I wanted to update a Geiger counter reading every 5 minutes, so this is plenty. Here is how it looks on our PTA website, pretty cool huh?

The way we are going to deal with this in the code is by caching the web server's response. The example I am showing here is very crude, but at least it works and respects web service netiquette by not flooding the server with superfluous requests.

One final word about web services: JSON. This stands for JavaScript Object Notation. It used to be a very convenient way for a service to return data, because you would just get a bit of JS code that, once evaluated, would populate your namespace with the data you had requested. Obviously, evaluating data returned by a service that can potentially host content created by malicious users is never a good idea. Well, we are using PHP anyway so we don't care; besides, PHP provides JSON parsing functions.

Now let's have a look at the code:


// Load the cached JSON response (crude cache: a plain file next to the script)
$json = file_get_contents('json.txt');
$flux = json_decode($json);

// Current time and the time the cached data was produced (the feed's 'at' field),
// both expressed as Unix epoch seconds
$right_now = new DateTime();
$seconds_now = $right_now->format('U');
$json_cache_date = new DateTime($flux->datastreams[0]->at);
$seconds_then = $json_cache_date->format('U');


What we have here is pretty simple. We open the cached JSON data (once again, a crude example; this should probably be stored in a better place, like your website database, and named in a more multi-feed-friendly way) and extract the information about when this data was returned. This is specific to Pachube's data format; other web services might store this information in another place or way. We also generate information about the current time, for comparison. Both times are expressed in epoch units, that is, the number of seconds elapsed since a specific event (guess which :P).


// Re-fetch from the Pachube API only if the cache is more than 5 minutes old
// or if the cached document reports an error
if ( (($seconds_now - $seconds_then) > 300) || (property_exists($flux, 'errors')) )
{
    /* fetch and cache JSON data from the feed, at most every 5 minutes */
    $request = curl_init();
    curl_setopt($request, CURLOPT_URL, "http://api.pachube.com/v2/feeds/26485.json");
    curl_setopt($request, CURLOPT_HTTPHEADER, array("X-PachubeApiKey: your_own_key"));
    curl_setopt($request, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($request, CURLOPT_BINARYTRANSFER, TRUE);
    $json = curl_exec($request);


Now we check whether the data should be fetched again or not. The interval I chose is 5 minutes, because it does not really make sense to get the radiation reading in realtime except during specific events like those of March. If the data is stale, we build the web service request using the curl library. Note the line that adds a header with your API key. This is where you should put the information you wrote down earlier. Don't forget the space between the colon and the start of your key. When the request is built, we throw it at the Pachube API server and get ready to handle the result.


    // Keep the response only if curl succeeded and the HTTP status is not an error
    if (($json !== FALSE) && (curl_getinfo($request, CURLINFO_HTTP_CODE) < 400))
    {
        $flux = json_decode($json);
        if ($flux !== NULL)        // json_decode() returns NULL on invalid JSON
            file_put_contents('json.txt', $json);
    }
    curl_close($request);
}


Several things can happen here. The server can return an error; it sometimes happens at night, for reasons I could not figure out because I was sleeping... Or the fetching can fail if you have network connectivity problems, etc. So we check that we got an answer from the API server and that this answer is a real JSON object, in which case we decode it and cache it.


// Display the reading, whether it came from the cache or from a fresh request
if ($flux !== NULL)
{
    $date = new DateTime($flux->datastreams[0]->at);
    $date->setTimezone(new DateTimeZone("Asia/Tokyo"));
    echo 'On: ' . $date->format('Y-m-d') . ' at ' . $date->format('H:i:s');
    echo '<br>';
    echo 'value was: ' . $flux->datastreams[0]->current_value;
    echo ' ' . $flux->datastreams[0]->unit->label;
}


Whether we got the information from the cache or from the API web server, there are cases where the data has errors, in which case you do not want to show it. But if we got the requested information, all that's left to do is parse it and push it to the user in a human-readable way. Et voilà!

Accelerometer + Xbee = geekish fun



Hi there, I hope you are all having a nice holiday season.
As for me, I am keeping quite busy despite the loneliness: desperately trying to do my job, giving some volunteer help for the Tohoku area, turning our PTA website upside-down, transforming a two-storey open-plan house into a 3+LDK, and helping out my little brother with his science assignment. And that is where today's story comes from.

So let me introduce my brother, who has been playing with water rockets since he was in high school. Now, for his first year at university, he managed to get himself a science assignment dealing with those very things. I won't explain the principle at hand here because I already have lots of things to say, so look it up on Wikipedia (I suggest reading the discussion page there too, it is abysmal in epic proportions) or check out this NASA picture.

Now you can imagine that you can do lots of fun stuff with your own rocket, especially when you learn that the record height achieved is around 600 meters... Well, a 2L soda bottle won't go that high, but it is still a very interesting way to conduct extreme experiments around embedded/autonomous systems.

When I was at my parents' home back in France, he told me about his latest experiments and how they wanted to get telemetry readings from the system. That sounded cool, and it made me think of my co-worker here who does all those neat things with Arduinos, Xbees, sensors and so on...

Actually, my brother and his friends' main skills are neither geeky gadgetry nor computer science, so the final picture is not as complex as I imagined, but it was still beyond their reach. Based on a survey of his friends and professors, he gathered that Xbees are cool and can work in an autonomous way without any microcontroller, which is good when you don't have a clue. They are small chips (see pic on the right) that speak 802.15.4, present a serial emulation to the user, feature 5 analog inputs with an associated ADC, and have a decent (depending on your own definition of decent) sampling frequency.
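To give an idea of what "autonomous without any microcontroller" means in practice, here is a hedged sketch of how the transmitting Xbee can be told, over its serial link, to sample an analog input on its own. The command names (D0 for the pin mode, IR for the sample rate) come from the 802.15.4 module documentation, but the values below are examples only, so check them against the manual for your firmware version; the same configuration can also be done with Digi's X-CTU tool.

import time
import serial   # pyserial

def at_command(port, cmd):
    """Send one AT command and return the module's reply (normally b'OK')."""
    port.write(cmd + b'\r')
    return port.read_until(b'\r').strip()

port = serial.Serial('/dev/ttyUSB0', 9600, timeout=2)   # adjust to your adapter

time.sleep(1.1)                           # guard time before the escape sequence
port.write(b'+++')                        # enter AT command mode (no carriage return here)
time.sleep(1.1)                           # guard time after the escape sequence
print(port.read_until(b'\r').strip())     # expect b'OK'

print(at_command(port, b'ATD0 2'))    # configure pin AD0 as an analog input
print(at_command(port, b'ATIR 14'))   # sample rate in hex milliseconds (0x14 = 20 ms)
print(at_command(port, b'ATWR'))      # save the settings to non-volatile memory
print(at_command(port, b'ATCN'))      # leave command mode

port.close()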

Well, to be honest, he actually told me about all this on the phone before I came back, so that I could do a small shopping trip around Akihabara and bring back an accelerometer. There it is: a KXM52-1050, straight from Akizukidenki, behind the now defunct Yamagiwa/Livina (I just saw they were doing some construction work there, something happening soon I guess). This little chip runs on 3.3V, which is great because most of the boards that can host the Xbee provide a 3.3V regulator to feed the sensors.

Now here's his plan: one Xbee on the rocket, with some form of power supply, hosted on a regulated board to which the accelerometer sensor is also hooked up. On the receiving side, another Xbee (using just one doesn't make sense, does it?) on a USB board plugged directly into a computer, and voilà. It sounded so EASY.

Now that I look back on the whole process, it was. But for some reason, the Xbee manufacturers went to great pains to write very detailed yet completely useless documentation. All the information needed is inside, but after reading the whole 68 pages, I felt like I still did not know what to do. Also, there have been several versions of the chip, things changed, and there is no clear history of what went in and what didn't. One important thing I learned at my own expense is that the ADC used to do the conversion on a fixed 0~1.2V range only. But in the latest document there is no mention of the voltage range. There is a two-line section about the ADC that states that "Xbee supports ADC and here is how to enable it". Period. After lots of digging, I found out that the "Pin signals" table states that Pin 14 is the "Voltage reference for A/D input". If the 68 pages were a haystack, this line would be the needle.
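For reference, once you know the reference voltage, turning a raw ADC sample back into an acceleration is just a scale and an offset. Here is a small sketch assuming the 10-bit ADC and the 0~1.2V reference discussed above; the zero-g voltage and sensitivity are placeholder values that you should replace with the figures from your accelerometer's datasheet (and mind any voltage divider between the sensor and the Xbee).

ADC_BITS = 10        # resolution of the Xbee ADC
VREF     = 1.2       # volts on the A/D voltage reference pin (Pin 14)
ZERO_G_V = 0.6       # sensor output at 0 g, after any divider -- example value
V_PER_G  = 0.2       # sensor sensitivity in volts per g -- example value

def raw_to_g(raw):
    """Convert one raw ADC count (0..1023) into an acceleration in g."""
    volts = raw * VREF / (2 ** ADC_BITS - 1)
    return (volts - ZERO_G_V) / V_PER_G

if __name__ == '__main__':
    for raw in (0, 512, 1023):
        print(raw, '->', round(raw_to_g(raw), 2), 'g')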

And then, the plan worked! That is, until I saw the "software" that some friend gave him in order to do the Xbee monitoring. If you can call a 20-line VB script software, that is. Which is why I made another one for him that does .csv export and much more. More on it in another article, but meanwhile, enjoy the output. Guess which line is the Z?


PS: I know I am getting behind schedule with the follow-up to the Pachube article. It is still in the pipeline...

Tuesday, June 7, 2011

The internet of things is good for your health


At some point, the "Internet of Things" (IoT) was quite a popular topic. I guess it lost the buzz war against the cloud, but yeah, that's a fight you'll never win... However, the whispering shadows of sensor mania and IoT can still be heard if you listen closely. And IoT can actually be good for your health!


As a concept, IoT is as loosely defined as "cloud", but what did you expect from a buzz war contender anyway? For this article we will focus on an instance of a very specific IoT concept and platform, called Pachube. In just one sentence, Pachube is a web service that provides centralized hosting for sensor data. Like most web services it is rate limited, and it offers different limits based on the premium you pay. There is also a free tier.

On the other hand, I guess that by now most of the people reading this have learned about the twin disaster that hit Japan, followed by both an unlikely nuclear meltdown and unfortunate radioactive leaks into the air, soil and water. In this context, people suddenly took an interest in Geiger counters, and they quickly sold out. A few brave people put theirs on the web, through various means (digitizer software, webcams, etc.).

Now this is the kind of Thing that can go into the Internet of Things. So a few of the people who had digital probes, or probes with digitizer software, set up a Pachube account to host measurement data. Then someone stepped forward to put together a user interface for all those values and plot near-realtime measurements on a map. Here it is: http://japan.failedrobot.com



Next, I will show you how to get some values from Pachube to display on your own webpage, so stay tuned...

Monday, April 18, 2011

User interfaces gone fuzzy, a case against new interaction devices

So this is a rant about all the trendy new interaction devices. In fact, it is not so much against the devices themselves as against the way people keep trying to use them to emulate old things.

Our knowledge, our science, our tools have been built in an incremental way. Of course, who would invent a spanner when nuts and bolts don't exist? And even when someone has a great idea that is very far ahead of its time, it often meets with doubt and perplexity. This is how we just missed getting computers 100 years early, which might have bumped human technology onto a very different path.

And so this is how, after getting computers and monitors, people feeling the urge to interact with those beasts soon came up with the keyboard, and then the mouse. For a very long time, keyboard and mouse have reigned supreme over the world of user interfaces.

Hypertext Editing System. Original photo by Greg Lloyd (source: Wikipedia)

Of course there had been attempts at alternative ways, but these interfaces never took off and were doomed to stay a niche market. Probably a few schoolboys and girls from the 80s in France remember the Thomson light pen. And there have been graphics tablets for decades now, and people using 3D VR stations have those gloves, and videogames have their own things filled with buttons.

But we all know the latest trend that seems to be taking off for real: touch screens. This is not the dawn of touch interfaces; they were pioneered long ago by the graphics industry. Despite a few attempts at PDAs and smartphones (P900 anyone?) that remained expensive toys, the widespread use of touch devices more likely began with the Nintendo DS and then the iPhone.

So what's wrong with touch devices? I would say feedback. When you have used a keyboard for so long, feedback is what lets you know you mistyped even before seeing it. This does not happen with touchscreen-emulated keyboards. Worse, the software thinks it is smart enough to guess what you intended to type, even in some situations where you did *NOT* make a mistake.

This is all because we want to use touchscreens the way we used keyboards. But touchscreens will never be keyboards; they are screens! Kids like to look at pictures and slide them to see the next one. This is intuitive and cool. Probably the kids born around now will never have to use a real keyboard and won't miss it. But for me, just as the mouse supplements the keyboard but cannot substitute for it, the same can be said about the touchscreen. How many people use mouse emulation with the numpad?

Don't type technical words!

This all-touch trend, I call it the "fuzzy interface". And by the way, I find it much more tiring to type on a touchscreen, because you have to check continuously that no error has been made, either by you OR by the software. Here is an anecdote to close this case, adapted from a Google Summer of Code mailing list for mentors:

- Random person says: "company XYZ can help assess candidate students' abilities blablabla..."
- Person in charge at Google answers: "actually I am hot interested in other companies reviewing our students..."
- ...8 hours later, same Google person: "Not, not hot. Touch screen keyboard :-)"

Happy touching!

Thursday, January 6, 2011

Best wishes for 2011

Hi dear readers, and welcome to the year 2011 (of the cute bunny), which should hopefully be more peaceful than the year 2010 (of the tiger).

As some people might know, my hobbies include (a lot of) cooking, so it is with a recipe that I am going to kick off the new year. All the ingredients can be bought here in Japan, so you can probably cook this one almost anywhere in the world.