Samsung C-Lab's gee-whiz results to draw looks in Vegas
Skunkworks is the tag given to units within big companies where creative thinkers have the freedom to work on projects, away from colleagues who keep their heads down to satisfy product launch dates.
Samsung has good reason to plump up its reputation as an innovator (worldwide publicity) and it is seizing the opportunity of CES 2020, which starts January 7, to do just that, showing off gee-whiz skunkworks innovations.
SlashGear remarked that the company's C-Lab is its skunkworks arm that allows it to fund those wild ideas without affecting its main products and turn them into profitable businesses afterward.
As Samsung's C-Lab is sending its project showcase to Vegas, the company's news arm is only too happy to reveal what is to come. On Sunday, Samsung Electronics announced that it will show projects from its C-Lab Inside program and products from startups participating in its C-Lab Outside program.
C-Lab Inside is an in-house idea incubation program. C-Lab Outside began in October 2018 as a startup acceleration program; the startups are provided with financial support, business collaboration, and opportunities to participate in global IT exhibitions alongside Samsung.
You will be hearing a lot about SelfieType, an Inside project.
Consider a virtual keyboard that uses a front-facing selfie camera and is adaptable to smartphones, tablets and laptops. How SelfieType works: an AI engine analyzes finger movements captured by the front camera and converts them into QWERTY keyboard inputs.
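Samsung has not published SelfieType's method, but the final step it describes (turning estimated finger positions into key presses) can be sketched in miniature. Everything below is illustrative: the `key_at` function and the sample tap coordinates are assumptions, standing in for whatever the real vision model emits.

```python
# Illustrative sketch only; SelfieType's actual pipeline is unpublished.
# Maps a normalized fingertip "tap" position to the nearest key on a
# virtual QWERTY layout imagined flat in front of the camera.

QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x: float, y: float) -> str:
    """x, y in [0, 1): x runs left to right, y from top row to bottom row."""
    row = QWERTY_ROWS[min(int(y * len(QWERTY_ROWS)), len(QWERTY_ROWS) - 1)]
    col = min(int(x * len(row)), len(row) - 1)
    return row[col]

# A hypothetical vision model would emit tap coordinates like these:
taps = [(0.05, 0.1), (0.25, 0.5), (0.95, 0.1)]
print("".join(key_at(x, y) for x, y in taps))  # prints "qdp"
```

The hard part, as the commenters below point out, is not this lookup but producing tap coordinates accurate enough that nearest-key snapping lands on the key the user intended.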
SlashGear's Ewdison Then thinks it is an odd spin on a rather questionable idea. In using a Galaxy phone's front camera instead of an external Bluetooth device, said Then, one might have a problem: "You'll practically be typing blind since there's no keyboard projected on the surface."
Typing blind did not seem to put off Android Authority's Hadlee Simons who thought the SelfieType entry "might be the most interesting project on-hand in Las Vegas."
But even Simons recognized that having no keyboard projection to work with means accurate typing could be a problem if the algorithm falls short. "Hopefully the concept includes some sort of cursor or on-screen indication of what you're about to type."
One Android Authority reader comment said typing blind might not be such a deal breaker. The reader said "I just tried 'typing' on a flat surface just to see what it'd feel like, and though it is really awkward with no tactile feedback it does seem like something I could eventually get used to."
He said the biggest issue would not be typing that way but getting it right. "I don't feel like my typing is very accurate that way, so whatever AI system they're using to detect what you're typing will have to be very good at not only tracking your hands, but also at correctly guessing what key you intended to press, even if in reality your finger was a bit off."
You will also be hearing about Hyler, the smart highlighter, which transfers text from paper to mobile devices.
Becon, meanwhile, is a combination handheld device and app that lets you check how well your scalp is doing. This diagnostic device analyzes hair follicle density, dead skin, temperature and humidity based on a machine learning algorithm. Next step: it recommends the most suitable solution.
SunnySide is a lighting device—think artificial sunlight. It's a device that you install on your wall, as you would a picture frame. SunnySide is shaped like a little window. Samsung said it "enables the user to enjoy the sunlight that changes by the hour by copying the full spectrum of the actual sunlight. It also helps users synthesize vitamin D from indoors."
Ultra V is for skin and vitamin D monitoring. Samsung described it as a "sensor and service that records ultraviolet rays daily." You integrate it into your wearable device. You monitor and manage your skin condition and Vitamin D production "influenced by accumulated exposure to solar UV rays."
A robot from Circulus is due from the C-Lab Outside, and the robot is called piBo. Samsung has positioned it as a little robot suited for single-person households. Simple conversation, check. News and weather, check. Companionship, check. "It interacts with users based on emotional analysis of facial expressions and contents of conversations and gives appropriate responses with sayings, music and dance." There is even a robot app store to download new features and content.
'FITT' is a healthcare data platform based on exercise tests. After cardiorespiratory, posture and muscle strength tests, users get a personalized exercise program based on their health status compared with other people of the same age and gender.
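FITT's scoring method is not public, but one common way to put a test result in context against an age-and-gender cohort is a simple percentile rank. The cohort scores and the `percentile` helper below are hypothetical, purely to show the idea.

```python
# Illustrative only: FITT's actual scoring method is unpublished.
# Ranks one test result against a reference cohort of the same
# age and gender as a percentile (share of cohort scored below).
from bisect import bisect_left

def percentile(value, cohort_scores):
    """Percentage of the cohort scoring strictly below `value`."""
    ranked = sorted(cohort_scores)
    return 100.0 * bisect_left(ranked, value) / len(ranked)

# Hypothetical cardiorespiratory scores for a same-age, same-gender cohort:
cohort = [32, 35, 38, 40, 42, 44, 46, 49, 52, 55]
print(percentile(45, cohort))  # 45 beats 6 of 10 scores -> 60.0
```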
VTouch can control devices without touching them, which may appeal to people who think about keeping displays clean. How it works: Computer vision technologies kick in to track the user's eyes and fingertips.
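VTouch's actual pipeline is unpublished, but a common geometric approach to touchless pointing casts a ray from the tracked eye position through the fingertip and intersects it with the display plane. The coordinates and the `pointed_at` helper below are assumptions for illustration, not VTouch's method.

```python
# Hedged sketch of eye-through-fingertip pointing, with the screen
# modeled as the plane z = 0 and positions in metres.

def pointed_at(eye, fingertip):
    """Return the (x, y) point on the screen plane z = 0 that the
    ray from `eye` through `fingertip` hits."""
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    if ez == fz:
        raise ValueError("ray is parallel to the screen")
    t = ez / (ez - fz)  # ray parameter where it crosses z = 0
    return (ex + t * (fx - ex), ey + t * (fy - ey))

# Eye 60 cm from the screen, fingertip 30 cm, slightly right and up:
print(pointed_at((0.0, 0.0, 0.6), (0.1, 0.05, 0.3)))
```

With the sample values the ray crosses the screen at roughly (0.2, 0.1): the fingertip sits halfway between eye and screen, so its offset from the line of sight is doubled at the display.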
Another feature will be Smoothy, an application for 8-person video chats. It provides video calls using Samsung AR Emojis. The AR Emoji can mirror the user's facial expression and motions in real-time.
The curtain-raisers will stay on the Samsung track with another kind of labs show-stopper: Samsung will show its artificial human, called Neon. This time the producer is STAR Labs (short for Samsung Technology & Advanced Research Labs), described as an independent entity of Samsung Electronics.
Neon is getting an OTT dollop of preview drama, complete with a clock counting down the days, hours, minutes and seconds until the robot makes its appearance in Vegas.
The Korea Herald on Monday: "It was developed by Samsung Technology and Advanced Research Lab, based in the United States, under the leadership of the unit's President Pranav Mistry...Neon is likely to be introduced as a new AI platform of Samsung."
The Korea Herald report by Song Su-hyun quoted "an industry official," who remarked that global tech giants like Samsung were racing to create "something that can be called AI assistants that are like real humans, beyond the current device-based platforms."
© 2019 Science X Network