Reading this post about the Windows 7 interop samples, all I can think is “uh-oh, it’s Indexing Service all over again”.
Microsoft invented all the components for successful desktop search back in 2000. But they never thought through the user scenarios, and then Apple came along with the same concept done right and collected all the accolades. Now, with the Sensor and Location APIs, Microsoft is running its old playbook instead of learning and improving.
The following sentence says it all:
“For example, the following image shows an updated version of the MSDN Reader. This version of the MSDN Reader changes the way the application looks depending upon the amount of light the Ambient Light Sensor detects.”
Changing the app colors based on the ambient light? Really? Is that the best you can do? Where is your imagination?
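And it’s not as if reading the sensor is the hard part. Here’s a rough sketch of what that whole headline feature boils down to in the native C++ Sensor API (my own guess at the plumbing, not code from the samples or the post; on a real machine you may also need ISensorManager::RequestPermissions before the sensor hands over data):

```cpp
// Minimal sketch: find an ambient light sensor and read the current lux value.
#include <windows.h>
#include <sensorsapi.h>   // ISensorManager, ISensor, ISensorDataReport
#include <sensors.h>      // SENSOR_CATEGORY_LIGHT, SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX
#include <cstdio>

#pragma comment(lib, "sensorsapi.lib")
#pragma comment(lib, "ole32.lib")

int main()
{
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);

    ISensorManager* manager = nullptr;
    if (FAILED(CoCreateInstance(CLSID_SensorManager, nullptr, CLSCTX_INPROC_SERVER,
                                IID_PPV_ARGS(&manager))))
        return 1;

    // Ask for every light sensor on the machine; take the first one.
    ISensorCollection* sensors = nullptr;
    ISensor* light = nullptr;
    if (SUCCEEDED(manager->GetSensorsByCategory(SENSOR_CATEGORY_LIGHT, &sensors)) &&
        SUCCEEDED(sensors->GetAt(0, &light)))
    {
        // Pull the current data report and extract the illuminance in lux.
        ISensorDataReport* report = nullptr;
        if (SUCCEEDED(light->GetData(&report)))
        {
            PROPVARIANT pv;
            PropVariantInit(&pv);
            if (SUCCEEDED(report->GetSensorValue(SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX, &pv)) &&
                pv.vt == VT_R4)
            {
                printf("Ambient light: %.1f lux\n", pv.fltVal);
            }
            PropVariantClear(&pv);
            report->Release();
        }
        light->Release();
    }
    if (sensors) sensors->Release();
    manager->Release();
    CoUninitialize();
    return 0;
}
```

That’s the entire trick: one COM object and one property read. The missing piece isn’t the API, it’s the sensors themselves and anything imaginative to do with them.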
Why not make the applications smarter? Fix your fucking Bluetooth stack and drivers, and make my computer know whether I’m near it based on the location of my phone. Then use that to (maybe) lock and unlock my computer, set the sound level for various notifications, and do other things I actually need.
And use your market position to twist the manufacturers’ arms into adding more sensors to their laptops. I want microphone arrays. I want high-quality webcams. I want an RFID tag on my wristwatch. I want my computer and my phone to remind me to take out the trash when I leave the house, because they figured out that my Wi-Fi signal is getting weaker.
I know that asking manufacturers to spend an extra $5 on a $2,000 computer is tough when most of them won’t even spend an extra buck on a backlit keyboard, but without the sensors, your sensor API is useless.