Samsung Voice of the Body
by Paul McLellan on 05-28-2014 at 2:12 pm

I just got back from Samsung’s big announcement held at the SFJazz Center (very conveniently a 15-minute walk from my place). They put a stake in the ground about their program at the intersection of medicine, health and technology. They had said in advance that they would not announce any new hardware, but in fact they did…although you can’t buy one yet. They were clearly going for an Apple-style announcement.

Young Sohn, Samsung’s president and chief strategy officer, opened the show, talking about how the first generation was just apps on your iPhone (oops, I mean Galaxy). The second generation is fitness devices like Fitbit. And the third generation will have new wearable sensors. He pointed out that your car does a better job of monitoring its health than is possible for you today.

What they decided to do was to build an open platform for development, consisting of three parts, which they covered in some detail:

  • Simband
  • SAMI
  • Partnerships, the biggest being IMEC and UCSF Medical Center.

The big challenge is that sensors are not quite good enough yet: they consume too much power, are too large, and are too pricey. So they built a prototype platform. The first decision was where to put it: glasses, wrist, leg, ear or chest. They decided to go for the wrist, since people are used to wearing things on their wrists, even though it is not the ideal location for data acquisition.


The band is impressive. It looks nice. Inside is what they call the sensor “bucket,” where you can put sensors so that they sit against the skin. Of course there is a display and a battery. IMEC, in addition to all its famous research on semiconductor technology, is also the world’s leading biosensor research organization. There are clearly some clever sensors: multi-wavelength LEDs that penetrate the skin and can detect the movement of blood in the vessels, galvanic skin response, a two-probe electrocardiogram, and an estimate of blood pressure. But the sensors need not come from Samsung; the expectation is that startups and other experts will build them too.
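
As a purely illustrative aside, optical sensing of this kind typically recovers heart rate by detecting the periodic pulse in the reflected-light signal. Below is a minimal sketch of that general technique in Python; the function name, filtering choices and synthetic data are my own assumptions, not IMEC’s or Samsung’s actual pipeline.

```python
# Hypothetical illustration: estimating heart rate from an optical (PPG-style)
# sensor trace via peak detection. Generic technique, not Samsung/IMEC code.
import numpy as np
from scipy.signal import find_peaks

def estimate_heart_rate(ppg, sample_rate_hz):
    """Return beats per minute estimated from a raw pulse trace."""
    # Remove the slow baseline with a one-second moving average so only the
    # pulse waveform remains.
    window = int(sample_rate_hz)
    baseline = np.convolve(ppg, np.ones(window) / window, mode="same")
    detrended = ppg - baseline
    # Each peak is one heartbeat; enforce spacing equivalent to <= 180 bpm.
    min_distance = int(sample_rate_hz * 60 / 180)
    peaks, _ = find_peaks(detrended, distance=min_distance,
                          prominence=np.std(detrended))
    if len(peaks) < 2:
        return float("nan")
    mean_interval_s = np.mean(np.diff(peaks)) / sample_rate_hz
    return 60.0 / mean_interval_s

# Synthetic 75 bpm pulse sampled at 100 Hz, with a little noise.
t = np.arange(0, 30, 0.01)
trace = np.sin(2 * np.pi * (75 / 60) * t) + 0.1 * np.random.randn(t.size)
print(round(estimate_heart_rate(trace, 100.0)))  # ~75
```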

IMEC designed a chip that I presume Samsung manufactured. The whole board is just 14x34mm; the chip is built on a 28nm process and includes Bluetooth and wireless connectivity and a 1GHz dual-core ARM. The band can be charged with a little shuttle battery that attaches (normally while you sleep), so you never need to take the Simband off.


And it is all real, not just a plastic mockup of what it will look like one day. Ram Fish gave a live demo on stage: you could see his electrocardiogram, blood pressure, heart rate, heart rate variability and more, with the signals all changing continuously on the screen projected above the stage.


They have worked with both UCSF and a company called physiq to generate a continuous general wellness score. physiq has had success detecting problems in heart-attack patients days before the patient feels any different.
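
To make the idea of a continuous score concrete, here is a deliberately naive sketch: it treats wellness as how far a handful of vitals drift from a personal baseline. The signal names, weights and scale are invented for illustration; the physiq/UCSF analytics are far more sophisticated.

```python
# Illustrative only: a toy "wellness score" as deviation from a personal
# baseline. Names, weights and scaling are invented; not physiq's model.
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate_bpm: float
    hrv_ms: float          # heart rate variability
    systolic_mmhg: float   # blood pressure

def wellness_score(current, baseline):
    """Return 0-100, where 100 means vitals match the personal baseline."""
    deviations = [
        abs(current.heart_rate_bpm - baseline.heart_rate_bpm) / baseline.heart_rate_bpm,
        abs(current.hrv_ms - baseline.hrv_ms) / baseline.hrv_ms,
        abs(current.systolic_mmhg - baseline.systolic_mmhg) / baseline.systolic_mmhg,
    ]
    # Equal weighting for the sketch; a real model would learn the weights.
    penalty = 100.0 * sum(deviations) / len(deviations)
    return max(0.0, 100.0 - penalty)

baseline = Vitals(62, 55, 118)
print(round(wellness_score(Vitals(88, 32, 135), baseline)))  # stressed week -> lower score
```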

The band is not available yet, but you can already see just how big a change this sort of technology will make to monitoring your own health and wellness, and potentially to sharing data with your doctors, truly ushering in an era of personal medicine: based on you, rather than on people like you.

The second part of the announcement was SAMI, the Samsung Architectural Multimodal Interaction (somehow I think they picked the acronym first and then worked out what it might stand for). This is a cloud-based system for uploading all the raw data and then allowing sophisticated algorithms to derive useful insights from the massive data sets. You, the user, own the data and control who is granted access to it. The APIs for SAMI will be available by the end of the year at the Samsung Developer Conference. The goal is easy access, but with full security and what they call “frictionless ingestion,” which sounds more like drinking a good beer.
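
For a sense of what “uploading all the raw data” might look like from a developer’s point of view, here is a hedged sketch of a batch upload over HTTPS. The endpoint, token scheme and payload shape are placeholders of my own; the real SAMI APIs had not been published at the time of this announcement.

```python
# Hypothetical sketch of pushing raw sensor samples to a cloud ingestion API.
# Endpoint, auth scheme and payload shape are invented placeholders.
import json
import time
import urllib.request

def upload_samples(device_id, access_token, samples):
    """POST a batch of raw sensor samples; return the HTTP status code."""
    payload = json.dumps({
        "device_id": device_id,
        "uploaded_at": int(time.time()),
        # e.g. [{"type": "ppg", "t": 0.00, "value": 0.42}, ...]
        "samples": samples,
    }).encode("utf-8")
    request = urllib.request.Request(
        "https://api.example.com/v1/ingest",            # placeholder endpoint
        data=payload,
        method="POST",
        headers={
            "Content-Type": "application/json",
            # The user, as owner of the data, grants this token to a service.
            "Authorization": "Bearer " + access_token,
        },
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```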

They showed Ram’s health going up and down over the previous week depending on whether he was awake or asleep. “Looks like he was having a stressful week, I wonder why?”

To kick-start an ecosystem they have created a $50M investment fund for sensors, software and so on. Plus I recognized several VCs in the audience. Lip-Bu was in the front row, as it happens; I’m assuming with his VC hat on rather than his Cadence CEO one.

This is clearly Samsung’s stake in the ground in a space that I’m sure will also see entrants from Google and Apple. They want to get out ahead, both in perception and in creating an ecosystem around their technology, since they are the first to admit they are not health and medical experts in all areas. It is going to be interesting to watch.
