Let your wearable data "speak" for itself! We're introducing SensorLM, a family of sensor-language foundation models trained on ~60 million hours of data. SensorLM learns to interpret high-dimensional sensor signals, enabling robust wearable data understanding with natural language. Blog post: https://goo.gle/4lSLwQi Paper: https://lnkd.in/gVCf4S5i
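For readers curious what "sensor-language" means in practice, here is a toy sketch of the general idea of mapping sensor windows and text captions into a shared embedding space and retrieving the closest caption. This is NOT the actual SensorLM architecture (the real model learns its encoders from ~60 million hours of wearable data); the "encoders" below are fixed random projections, and the captions are invented, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared embedding dimension for both modalities.
DIM = 16

# Hypothetical caption vocabulary with pre-computed text embeddings
# (random here; a real model would use a learned text encoder).
captions = ["steady walk", "interval run", "restful sleep"]
text_emb = rng.normal(size=(len(captions), DIM))
text_emb /= np.linalg.norm(text_emb, axis=1, keepdims=True)

def embed_sensor(window: np.ndarray, proj: np.ndarray) -> np.ndarray:
    """Project a (time, channels) sensor window into the shared space."""
    pooled = window.mean(axis=0)   # crude temporal pooling
    v = pooled @ proj              # linear stand-in for a learned encoder
    return v / np.linalg.norm(v)

def caption_for(window: np.ndarray, proj: np.ndarray) -> str:
    """Return the caption whose embedding has highest cosine similarity."""
    v = embed_sensor(window, proj)
    return captions[int(np.argmax(text_emb @ v))]

# Fake 3-axis accelerometer window: 100 timesteps x 3 channels.
proj = rng.normal(size=(3, DIM))
window = rng.normal(size=(100, 3))
print(caption_for(window, proj))  # prints one of the three captions
```

The takeaway is only the shape of the interface: high-dimensional signals in, natural-language description out, via a shared embedding space.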
Very good, but can someone help me understand how this would be better than querying a database with, say, a language model?
This is awesome! Thanks for sharing it!
Really cool!
Definitely worth reading
Interesting work with SensorLM and the vast training set, but there's still one vital signal that even the best wearables miss: the breath. At Breethwell, we've developed a sensor platform called CMAP that detects and decodes breath patterns with precision, helping identify silent stress events like Day Apnea® and Screen Apnea®. These phenomena affect cognition, autonomic balance, and overall well-being, yet they're invisible in most current datasets. Breath is a real-time biomarker that can add a powerful new dimension to your models. If Google Research is serious about comprehensive human sensing, we'd love to open a conversation. Let your models BREATHE. Breathe Better, Live Better, Better Data. Better AI, Better Outcomes. www.xn--ud2hgaaaoza6bu5a.com #BreathAsData #PhysiologicalAI #DigitalBiomarkers #SensorLM #BreatheWell #MentalHealthTech
💡 Great insight
Cool! I'm not familiar with wearables. Would there be any chance to run such a model (or a derivative) on a device like a smartwatch?
If you're interested in how to implement MCP to fetch this HealthKit data and interact with it, check out this guide: https://www.elastic.co/search-labs/blog/how-to-build-mcp-server
Google's legal team can now prepare to file lawsuits against the TW government-owned internet thugs for illegally abusing and trespassing on Google's core systems.
This is exciting