it features a new JavaScript virtual machine called V8, built by a Google team in Denmark
the ‘omnibox’ (cf. the ‘awesomebar‘ in Firefox) is located at the tab level instead of the window level and is tightly integrated with (you guessed it) Google
a new tab shows you your 9 most visited sites and your 3 most used search engines (a bit like Opera’s Speed Dial)
it is not clear whether Google used Mozilla’s XUL/chrome to build the UI elements, but the name might be an indication that they did, and the comic does state that Google “owes a great debt to other open source browser projects, especially Mozilla and WebKit”, so …
Looks very interesting, I’ll download it as soon as it’s available later today. But I’m curious what the Mozilla guys think of what must be a double dent in their ego, with a friend turned foe (well, to a certain extent) and with Google not using Mozilla’s Gecko as HTML rendering engine. Update: a screenshot of the new browser:
Although browsers clearly have become better, faster and stronger (I doubt they’ve become “harder” as well), it sometimes seems as if no revolutions have taken place apart from the introduction of XMLHttpRequest by Microsoft back in 1999. But this morning I saw something that really blew my mind, and the live mashup of that great Daft Punk song perfectly describes the mood I’ve been in ever since. The reason for all this excitement is a prototype of new functionality in Firefox that redefines how you can interact with websites and web applications, allowing you to use the web more efficiently. Just watch this video to see what I’m raving about (skip the first 50 seconds to see the actual goods);
Ubiquity, as the 0.1 Firefox add-on is called, is the work of a group of smart people at Mozilla Labs, headed by Aza Raskin. Aza is the guy behind Humanized, the company that developed Enso, a merger of a GUI and a CLI that leverages the power of language in a graphical user interface. Aza and a number of his co-workers joined Mozilla at the beginning of 2008 and they’ve already produced some innovative ideas over the last few months. Ubiquity is past that initial idea stage, with a prototype that really builds on the great ideas Aza and his Humanized co-workers had with regard to the power of language in a UI. I’ll bet you this will be the way to disclose and use microformats in Firefox as well (breaking the deadlock the microformat guys were in). Even though it’s still in alpha/prototype phase, this is the future, guys, and it works! Now try it out, will ya!!
With the nineties browser wars and the quasi MSIE monopoly that followed the Netscape debacle behind us, the desktop browser scene can be considered a mature market, with some very good products vying for our approval. Time to shift our attention to the next battleground: mobile browsers. NetFront and Pocket Internet Explorer dominated this emerging market for quite some time, but as of late some newcomers have been making great advances in this area. And apart from Opera Mobile and Mini (the Mozilla guys are really ages behind here), these all share the same open source core: WebKit.

The history of WebKit in 10 1/2 sentences

WebKit is a fork of KHTML, the HTML rendering engine that was developed by the KDE community for its Konqueror browser. In 2002 Apple decided to build its own browser based on KHTML, and thus WebKit was born as the core component of what would become Safari. Since its inception, WebKit has gained enormous momentum; Safari now has a market share of approximately 6% on the desktop, and smaller projects such as iCab and Epiphany (the Gnome browser!) picked up WebKit as well. But there’s more: Adobe decided to incorporate it in AIR (the Flex-like platform for building desktop software). And Trolltech, the company behind the Qt GUI toolkit and one of the primary backers of KDE, announced they would include WebKit in Qt 4.4 as well.

WebKit 0wnz Mobile

But the mobile arena is where WebKit is really taking the world by storm; it not only powers the mobile version of Safari on the iPhone and the iPod Touch, but WebKit (in its S60WebKit form) is also the basis of Symbian’s S60 browser. Nokia’s Mini Map Browser, as it’s officially named, was first released in November 2005 and thanks to the success of Symbian it’s probably the most widespread mobile browser by far. Being a proud Nokia E61i owner myself, I can testify that it is a great browser indeed; I didn’t even bother installing Opera Mini (which I used instead of NetFront on my Sony Ericsson W810). Next to these two well-established WebKit derivatives, the lesser known Iris (for Windows Mobile), newcomer Digia (for Symbian UIQ devices) and, last but not least, the browser of Google’s highly anticipated mobile Android OS are also part of the family.

Mobile Web, but there’s more than One

So thanks to KDE’s great job on KHTML and Apple’s (and Nokia’s) subsequent work, we are at a point where users of ‘smartphones’ and similar devices can access the internet almost as if they were using a desktop browser. But screen size, text input, data transfer (bandwidth and price) and context remain very different from normal browsing, so don’t believe the “one web” hype just yet. But still: these sure are great web times for building mobile(-ready) websites and applications!
Spam headlines sure make for an interesting read nowadays;
For a split second they succeeded in getting my attention and I almost opened some of these mails on mere impulse. A good thing they were already classified as spam.
(*) It won’t work on other mobile devices either; Flash Lite, which ships on e.g. Symbian and Windows Mobile powered devices, is not able to display those millions of fancy animations out there on the WWW.
So I bought a second-hand Nokia E61i which had a messed-up keyboard configuration. Symbian OS does not, as I had somewhat naively hoped, allow you to change your keyboard settings: the configuration is ‘hardcoded’ in the firmware and cannot officially be changed except by Nokia Service personnel. A good thing there’s Google, a nice little hacker tool and the Nokia Software Update utility. These are the steps I followed to flash my Nokia with the correct firmware (only possible under MS Windows XP or Vista afaik);
Back up your phone‘s data using e.g. the Nokia PC Suite (this will not back up the old firmware, only your data)
Press *#0000# on your phone and write down the firmware info you see, in my case this was:
2.0633.65.01
02-10-07
RM-227
nokia e61i-1
Check the product code of your phone (underneath the battery) and write that down. In my case this was “0542890”
Go to this page to find alternative product codes for your phone, cross-checking with the info from (2) and (3). I decided I needed “0538563 EURO A Mocha/Silver” (which has QWERTY) instead of the current “0542890 EURO D French Mocha/Silver” (which has AZERTY)
Think twice before proceeding, the steps below may cause permanent damage to your phone and may void your warranty! You have been warned!
So you’re sure you want to proceed? OK;
Make sure the USB connection between your PC and phone can remain in place for the next 30 minutes or so (no cats or children that might want to play with that USB cable). The USB connection is your phone’s lifeline; if it gets cut during the upgrade, your phone dies (well, kinda).
Fire up Nemesis:
click on the right top button (with the magnifying glass) to scan for a new device
click on the 2nd icon labeled “Phone Info“
in the “Production data edit” pane check the box next to “product code” and press “read“. The value there should match the product code you wrote down earlier
replace the product code with the one you think you need (cf. step 4) and press “write” (and do a “read” again to make sure the value is correct).
close Nemesis
Start Nokia Software Updater (can be done from within the PC Suite).
NSU actually is a pretty straightforward wizard that will guide you through the upgrade process. You will be warned several times about the dangers of flashing your phone, but by this step you should know what you are doing, no?
During the upgrade your phone will restart several times and you’ll hear Windows play the sounds that indicate USB devices being unplugged and plugged back in. Don’t worry, this is normal.
Close NSU when it says it’s ready
Disconnect the USB-cable
Check your phone’s firmware information by pressing *#0000# again. In my case this was
3.0633.69.00
06-02-08
RM-227
nokia e61i-1
So there you have it, not only was my keyboard mapping problem solved, I also got a free upgrade to the latest Nokia firmware. Qnd there zqs much rejoicing! 😉
WordPress 2.6 has been pushed out the door at Automattic and it contains some exciting new goodies as usual. So I fired up my trusty upgrade script, but got an ugly PHP error when accessing the database update pages:
Parse error: syntax error, unexpected T_SL in wp-includes/widgets.php on line 464
Turns out that the wp_widget_search function in wp-includes/widgets.php included some remnants of an SVN merge, presumably the “<<<<<<<” conflict markers, which the PHP parser trips over as an unexpected shift-left operator (T_SL). I don’t know if it was a sync problem on my side or if the faulty code was on the SVN server (it isn’t now), but I ended up copy/pasting the correct function from a fresh tarball I downloaded.
With new versions of our trusted browsers coming out, web developers who like living on the edge can start using some of the new features that are becoming available. One such goody is cross-document messaging, which is part of the HTML5 draft spec. Cross-document messaging allows children of a window (think iframes, popups, …) to communicate using JavaScript, even if they originate from a different domain. This means that instead of just iframing an external application, without being able to integrate further, your page can send and receive messages to/from it. postMessage could even be used to do cross-domain XHR (a hidden iframe on the same domain as a remote data source can be used as a proxy to perform XmlHttpRequests on that remote domain) until the real thing hits the streets as well.

The two additions that allow you to perform such messaging are window.postMessage and an event listener for events of the “message” type to handle the message. A pretty straightforward example of this can be found on jQuery author John Resig’s site (see also his latest blog entry about postMessage). As cross-domain JavaScript can be a potentially big security risk, taking some precautions into account is really really really really really necessary. Really!

On the downside (as if security is not a problem): this brand new feature is only available in Firefox 3 for now. My own little test (a copy of John Resig’s example with some minor tweaks) worked in Opera 9.2x (and 9.5b) as well, but postMessage seems to have been dropped from the final Opera 9.5, as the tests on Opera Labs don’t seem to work any more either. Support for postMessage is also available in WebKit (Safari‘s backbone) nightly builds and in Microsoft’s IE8 beta (with the event being ‘onmessage’ instead of ‘message’ and some other quirks, but hey, this is beta, no?). So expect postMessage to be available in all major browsers by the end of the year. But why wait, if you know that Facebook is already using postMessage in their chat application? I wonder what they fall back to if it is not available, though …
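To make this a bit more concrete, here is a minimal sketch (not John Resig’s actual demo) of the postMessage/“message” pairing described above. The domain names and the iframe id are made-up placeholders, and the origin check is exactly the kind of precaution mentioned above:

// In the parent page (say, on http://www.example.com); the iframe id and
// URLs are made-up placeholders for illustration only.
var frame = document.getElementById('partner-frame');

// Send a string to the embedded document; the second argument pins the
// origin that is allowed to receive the message.
// (In real code you'd wait for the iframe's load event before posting.)
frame.contentWindow.postMessage('hello from the parent', 'http://partner.example.com');

// In the embedded document (served from http://partner.example.com):
window.addEventListener('message', function (event) {
    // Security precaution: ignore messages from origins you don't trust.
    if (event.origin !== 'http://www.example.com') {
        return;
    }
    alert('received: ' + event.data);
    // Reply to the sender, again pinning the target origin.
    event.source.postMessage('hello back', event.origin);
}, false);

// IE8 beta wants window.attachEvent('onmessage', handler) instead, as noted above.

A simple if (window.postMessage) feature check is probably how you would decide whether to fall back to older cross-domain tricks.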
[UPDATE June 2009: this is solved in WordPress 2.8] Having a fair amount of experience with WordPress installations and configuration, I wanted to install trusty old WP 2.5.1 on an idle desktop (Windows XP + XAMPP) at work to do some blogging on our intranet. The installation itself went smoothly (how hard can unpacking a zip file be), but after some time the damn thing stopped working, producing nasty timeout errors caused by, among others, wp-includes/update.php and wp-admin/includes/update.php. The problem is that WordPress tries to open an internet connection (using fsockopen) to see if updates are available. Great, except when you’re trying to run WordPress on an intranet behind a proxy without a (direct) connection to the internet. After some unsuccessful fiddling in multiple WordPress PHP files, I ended up disabling fsockopen in php.ini (disable_functions)!
Disabling! Fsockopen! In php.ini! Just to have a working WP?
I mean, come on guys, why doesn’t WordPress provide configuration options where you can specify if and how (what type of proxy, what address to find it on, …) it should try to connect to the internet? I even made this truly amazing UI mock-up which you guys can just like copy/paste straight into your code;
How should WordPress connect to the internet to check for updates?
(*) Direct connection to the internet (default)
( ) Use a proxy:
Proxy type: (*) http ( ) socks
   Proxy URL: ___________________________________________
   Proxy User: ___________________________________________ (optional)
   Proxy Password: ___________________________________________ (optional)
( ) No internet connection available (WordPress won't be able
to warn you about updates!)
________________________________________________________________________________
I’ll be at WebScene 2008 today and if all goes well, I’ll be bringing you live updates of the event (as I did last year). So watch this space if you’re interested! Being the commuter I am, I took the train to Asse and rode my bike from Asse to Affligem (passing Asbeek and Edingen, very nice!) to arrive here at 9h00. So I’m at the conference center, scored Wifi access and I’m ready to watch and learn.

Bart Van Herreweghe (blog) kicked off with a talk about the new Belgium.be. The Kanselarij van de Eerste Minister worked together with Fedict on the production of the new portal, which was built by a multitude of companies such as IBM, Amplexor, Panoptic, Netway and Internet Architects. Because of the large amount of information that is published on the portal, Internet Architects and Netway played a very important role in information and user-centric interface design, introducing the idea of “doormat” navigation, which could be compared to (part of) a sitemap being displayed on a (theme) homepage. Technology-wise, belgium.be uses Tridion as WCMS, with templates that contain validated XHTML and a strong focus on accessibility aiming at Anysurfer Plus compliance. The search module, which will spider a great number of federal websites, is based on Lucene and was developed by Panoptic (Cronos) with LBi.

Panoptic’s Ines Vanlangendonck (blog) talked about the importance of usable web content management. Choosing a good foundation (WCM product) and customizing it to the (backend) users’ needs (e.g. adding or removing DAM functionality, rich text editor functionality, online translation, …) should help get your users (content owners, editors, …) on board. Looking at the poor adoption rate of the web content management tool chosen at a certain telco company a few years ago, she couldn’t be more spot-on.

Ex-colleague Philip Achten from The Reference presented the implementation of the new Thomas Cook website. This travel website is an e-commerce business platform first and foremost, with on average 15,000 unique visitors/day in 2007 and an estimated growth of 50% in 2008. One of the main goals of the new website was to allow the content team (15 people) and the travelling reporters to manage web content in a decentralized way. The Reference implemented Sitecore 5.3 for this purpose, a powerful Microsoft ASP.NET-based WCM solution, deployed on a load-balanced environment (2 web servers with IIS and 1 MS SQL database server). Next to the pure content management, a number of applications have been built, such as the destination search, newsletter, user registration and personalisation and of course the crucial booking application (connection to the backend booking engine). In a next phase, building on the user authentication application, user-generated content functionality will be added, allowing registered visitors to add text, pictures and video.

Ektron‘s Norman Graves held a talk titled “Key Technologies and how they impact some real world examples”. He talked about taxonomy and how it’s used in search, geomapping and personalisation in Ektron CMS 400.NET.

Lunchtime has come and gone, time for the afternoon tracks. I started with the presentation about Arte+7, the Arte media portal. The website and presentation were done by CoreMedia, who also provided the CMS and DRM infrastructure. Videos are published in FLV and WMV formats, with geolocation used to limit the countries from which one can watch the content. The same technology is also used in the Arte VOD site, for which Arte+7 is a teaser.
Kinda nice, but lots of JavaScript and Flash in that site, not really accessible.

For the 2nd session I moved to track 5, where U-sentric‘s Tara Schrimpton-Smith talked about “Guerilla Usability Tests? User testing on a shoestring”. Her advice: use friends of friends, somewhere between 2 and 5 users (with 2 testers you should be able to find 50% of usability issues, with 5 users 85%), and limit the number of tasks you’ll be testing. She concluded the session with a live example: someone shouted the name of her website, someone else volunteered and the task was ‘what is the address of the headquarters’. Judging by the time it took the test person to find this information, there are some usability issues on barry-callebaut.com. A fun session!

Next up: Robin Wauters (blog) about “Social media is not an option”. Not much to learn here (Robin talked about Technorati, Attentio, involving ‘influential bloggers’, blogging to showcase knowledge, “Dell hell”, buzz, virals, …), but it’s nice to be able to put a face to the guy behind plugg and edentity.

And we’ll finish off with AGConsult‘s Karl Gilis with “9 tips to help users find what they’re looking for on your website”. So let’s create an ordered list for that purpose:
ensure the accessibility of your site (it should work in all common browsers/OSes, don’t misuse technology, make sure Google can crawl your site)
speed up page load times; the user decides in half a second whether (s)he’ll stay or not
make navigation easy to use (structure, terminology, placement)
provide clear overview pages (example: belgium.be and its doormats)
your search should be as good as Google’s (depends on technology and content!)
use an intuitive page lay-out
make your text legible (Verdana 10pt, Arial if you’re adventurous)
write for the web
make sure the info is there (do user needs analysis)
A fun session as well; those usability guys and girls know how to entertain! My conclusion: this was not an uninteresting day, but the focus was clearly less technical than in the previous year’s edition. Content Management (around which much of this event was focused) is slowly but surely becoming a commodity and vendors are having a hard time differentiating themselves from their competitors. It is my feeling that the bigger changes and challenges with regard to “the web” are more on the application front, where backend integration (SOA, web services, …) and RIAs (using Ajax, GWT, Flex, …) are today’s hot topics. The fact that WebScene 2008 did not explore these new frontiers (and their implications with regard to business, marketing, usability and accessibility) is a missed opportunity really. Let’s hope they reconnect with the web-tech trends next year! And maybe I’ll be there to let you know?