Pebble Time First Impressions

So my Pebble Time is finally here, and unfortunately my first impressions will be brief. The build quality is far superior to that of the original!

The body itself is a beautifully textured plastic that actually feels quite nice while you're wearing it. The bezel is brushed metal, and the screen is now covered in Gorilla Glass. The buttons on the watch all have a nice tactile feel. They have a bit of give so they don't actuate immediately (I suppose in case you bump into things, or if the watch is pressing lightly into your hand). I would describe it as similar to the half-press of a DSLR shutter. When a button does actuate, it has a nice solid feel.

Lastly, the band. I actually like the rubber band of the Pebble Time a bit more than the fluoroelastomer sport band of the Apple Watch. The Pebble's is just a bit softer and nicer feeling. All in all, the package seems much better put together than the last iteration. It's no Apple Watch as far as construction goes, but it's also about half the price.

Unfortunately, that is where my impressions will have to end, at least for now. As I'm an iOS user, I'm not able to actually activate and pair my Pebble Time, because the Pebble Time app is stuck in App Store purgatory at the moment. This is more than a little frustrating, but since I sincerely doubt Apple is worried even a little bit about Pebble, I'm sure they will approve it shortly. At that point I can do a fuller review.

For now, not bad. I can't wait to see how it actually WORKS.


This App just SCHOOLED Siri, Cortana, and Google Now

Holy crap.  You had better believe someone from Apple is going to be on the phone with these people in the morning, if they aren't already.  

From the Caption:

In this video, SoundHound Inc. Founder & CEO, Keyvan Mohajer does examples of voice queries to Hound that show speed and accuracy, and the ability to handle context, detailed criteria, and other examples. Hound can’t do everything, of course, but, for users who believe that speaking to connected devices should be like how we speak normally - this shows that it’s now possible. So many of the things that you used to type, tap and swipe for can now be done effortlessly by speaking.

The technology underpinnings of Hound, all built in-house at SoundHound Inc., include the company’s Speech-to-Meaning engine. The company has also built the Houndify platform, for developers to leverage the technology and build smart, interactive voice interfaces to their own products, services, and experiences. Almost anything that is ‘connected’ can become Houndified.

The Unseen Cost of Google's New "Free" Photo Service


Is it just me, or does something not seem quite right about Google's new Photos service?

There was quite a bit of talk at Google I/O this week about how Google Photos uses machine learning to help you search through your images. You search for "cronuts" and voila! Every photo you've taken of those delectable cronuts you stood in line hours for will show up! From Business Insider:

Google uses machine learning to power the scary-accurate image recognition within the Photos app, and based on the demo I’ve seen, it’s incredibly fast and accurate. Other services let you search photos based on when they were taken, where they were taken, and how you’ve tagged them.

But with Photos, you can simply type in the word “pizza,” and any photos you’ve taken of pizza will pop up whether you’ve labeled the pictures or not. If you went rock climbing last summer, you can type in the phrase “rock climbing” to find those images. Google also automatically sorts photos based on these types of phrases in the Categories section. So, all of the photos of any given person will appear in one organized album, all photos of food would be in another, etc.

Google already uses machine learning — a term that refers to a type of algorithm that learns on its own without human intervention — for many of its products, including Google Now, Google Maps, and Search. But now we’re seeing it being applied to photo storage.
— http://www.businessinsider.com/how-google-created-the-worlds-smartest-photo-app-2015-6

Super cool, right? On the surface, YES! The amount of computation and engineering that has gone into a product like this is staggering.

But then I had a thought.  (Dangerous, I know.)

Why is it free?

No, seriously. Why is it free? Why is Google going to give away terabytes of VERY expensive cloud storage for free? Why is Google going to give away use of a photo recognition engine that required such a massive engineering effort, for free?

Answer? They aren't. Of course, YOU won't have to pay a thing for the service. Google will happily index every photo you take from your smartphone, apply its machine learning to it, and give you an infinite amount of storage for all of those beautiful memories. So what does Google get out of it?

I have a feeling Google is using its "machine learning" to index quite a bit more than most people think. They say it can tell the difference between palm trees, pizza, mountains, and people. If it is really that intelligent, will it also be able to tell, based on your photos, what kind of beer you like to drink at a summer barbecue with your friends? How about whether you prefer hot dogs or hamburgers? Will it be able to tell from your photos that you like to visit Disney World and go on Carnival Cruises for vacation? I bet people buying advertising on Google would LOVE to have information like that.

Google has spent years developing applications and services to build up a profile of your online activities and persona. Google Chrome and Google Search track your web browsing habits to deliver you advertising. Gmail algorithmically scans your email for advertising purposes. Google Maps uses your direction requests and GPS data to deliver you advertising. And now our last shred of real-life private moments is probably being used to deliver us relevant ads.

It has gotten to the point where Google now knows more about the average Android user than his or her partner does. The profile they have built on us contains our conversations, our web searches, our travel habits, who our friends are, and now everything we have taken a photo or video of. Google knows what things we want to keep a memory of. I don't know about you, but to me, that's creepy.

Tracking my online use was one thing, but knowing that Google is scanning my photos and videos to identify everything in them is a step too far. It seems, today more than ever, that the services Google delivers to us are not really the products it sells.

We are.

Crazy WebGL Water Demo!

I must have missed something. I wasn't aware WebGL had advanced to quite this point. This is a really cool proof-of-concept demo from MadebyEvan.com showing that WebGL can accurately simulate water, as well as the behavior of light in water. The fact that this runs as well as it does even on my lowly 2012 MacBook Air is super impressive. I'm embedding a YouTube video of the effect below in case your browser or computer can't run it, but I highly suggest you check it out at the site instead. Really, REALLY, cool.

A realtime pool of water rendered using WebGL with reflection, refraction, caustics, and ambient occlusion. The pool is simulated with a heightfield and contains a sphere that can interact with the water's surface. http://madebyevan.com/webgl-water/
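
For the curious: the "heightfield" mentioned in that caption is a surprisingly simple idea at its core. Below is a rough TypeScript sketch of the basic update step, just to illustrate how ripples propagate. To be clear, the actual demo runs equivalent math on the GPU in shaders, and the grid size, wave speed, and damping values here are my own illustrative assumptions, not Evan's.

```typescript
// A minimal sketch of a classic heightfield water update, for illustration.
// NOTE: grid size, wave speed, and damping are illustrative assumptions;
// the real demo does this kind of math on the GPU in fragment shaders.
const N = 64;                              // grid resolution (assumption)
const height = new Float32Array(N * N);    // water height at each cell
const velocity = new Float32Array(N * N);  // vertical velocity at each cell

function step(dt: number): void {
  const waveSpeed = 8.0;  // how fast ripples propagate (assumption)
  const damping = 0.99;   // keeps ripples from ringing forever

  for (let y = 0; y < N; y++) {
    for (let x = 0; x < N; x++) {
      const i = y * N + x;
      // Average the four neighbors, clamping at the pool walls.
      const left  = height[y * N + Math.max(x - 1, 0)];
      const right = height[y * N + Math.min(x + 1, N - 1)];
      const up    = height[Math.max(y - 1, 0) * N + x];
      const down  = height[Math.min(y + 1, N - 1) * N + x];
      const average = (left + right + up + down) / 4;

      // Each cell accelerates toward its neighbors' average height;
      // that restoring force is what makes ripples spread outward.
      velocity[i] += (average - height[i]) * waveSpeed * dt;
      velocity[i] *= damping;
    }
  }

  // Integrate heights only after all velocities are updated.
  for (let i = 0; i < N * N; i++) {
    height[i] += velocity[i] * dt;
  }
}

// "Dropping" something into the pool is just a bump in the heightfield:
height[(N / 2) * N + N / 2] += 1.0;
```

The pretty parts, like the reflections, refraction, and caustics, then come from the surface normals, which fall out of the height differences between neighboring cells.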