Not interested in the explanation? Skip straight to the demonstration to see it working.
The example uses jQuery to fetch JSON from YQL, which in turn gets the data from Microformats-enabled pages. Three things and lots of clever brains joining together to make semantic data goodness. If you’d like to look through the source you can download it or fork it on GitHub.
YQL is a neat service from Yahoo! that provides a query layer over selected APIs and data from across the web. For most web developers the syntax will be familiar, as it is pretty much standard SQL. It returns either XML or JSON and removes the pain of interrogating and parsing API data. Recently it was announced that YQL would support Microformats. Great - so developers now have easy access to Microformats data without having to build complicated parsers.
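To make that concrete, here is a minimal sketch of how a page might be queried. It assumes the YQL `microformats` table and the public YQL REST endpoint (both since retired by Yahoo!), and the example page URL is just a placeholder:

```javascript
// Build a YQL REST URL asking the microformats table to parse a page
// and hand the result back as JSON. The endpoint and table name are
// the ones YQL exposed at the time; both are assumptions here.
function buildYqlUrl(pageUrl) {
  var query = "select * from microformats where url='" + pageUrl + "'";
  return "https://query.yahooapis.com/v1/public/yql" +
         "?q=" + encodeURIComponent(query) +
         "&format=json";
}

// In the browser the resulting URL can be fetched with jQuery, e.g.:
// $.getJSON(buildYqlUrl('http://example.com/hcard') + '&callback=?', handler);
```

Note the query really is just SQL with a URL in the `where` clause - that is all it takes to get parsed Microformats data back.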
Well, the possibilities are endless. That’s the power of the semantic web. But for this example I’m going to suppose that Microformats really are a way to solve the problem of having your personal information distributed around the web in various locations. Being distributed is not really a problem in itself, but it is when the only authority for the information is an individual human. Having to change every instance of your information manually becomes a maintenance headache - your data can easily be forgotten and go out of date.
Why not instead use a Microformats-enabled HTML page that can be interrogated by a parser (in this instance YQL) to update or populate information? Much like DNS nameservers, but for personal information. So when I want to fill out a form I can enter a URL, hit a button and the work is done. The information is stored in one place, maintained by the authority (me) and then distributed around the internet by machines. Hurrah - the robots are taking over.
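The form-filling step above can be sketched as a small mapping from parsed hCard data to form values. The shape of the `hcard` object and the form field names here are illustrative assumptions - real YQL results nest the vcard properties differently depending on the source markup:

```javascript
// Pull a flat set of contact fields out of a parsed hCard object.
// The property names (fn, email, url) follow the hCard vocabulary,
// but the exact object shape is an assumption for this sketch.
function hCardToFormValues(hcard) {
  return {
    name:  hcard.fn || '',
    email: hcard.email || '',
    url:   hcard.url || ''
  };
}

// On a page with jQuery, the values could then populate the form:
// var values = hCardToFormValues(data);
// $('#name').val(values.name);
// $('#email').val(values.email);
```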
Given that personal data is tied to the unique identifier of a URL, it doesn’t really matter where the information is stored. It might be my last.fm account, my Facebook page or my personal website. Finally, personal data can be stored in one place.
Granted, this model needs an additional layer to say where the authority lies. On forms a human (me) can manually put the URL in, but there is no reason why a DNS-style model couldn’t work for this. If we can embrace open standards and a model that works well for domain names, we could have a workable system for personal information.
Yes - there are issues of standards, privacy, security and permissions, but some clever people are already tackling those. If I don’t have to put my details into every single site, every single time, I’ll be happy. We are getting there…
You can see the demonstration or fork the code from GitHub.