The BodyTrack project develops open source self-tracking tools to aggregate and visualize data from diverse sources such as wearable sensors, observations from mobile apps, photos, and environmental data. Our goal is to empower individuals to explore potential environment/health interactions (food sensitivities, asthma or migraine triggers, sleep problems, etc.) and better assess strategies they think might help.
This work is inspired by the experiences of people who have improved their health by discovering certain foods or environmental exposures to avoid, or by making other behavioral changes. Many describe greatly improved quality of life, overcoming such issues as chronic problems with sleep, pain, gastrointestinal function, and energy levels. In some cases, a medical diagnosis had led to treatment which mitigated symptoms (e.g. asthma or migraine headache), but the discovery of triggers required self-tracking and self-experimentation. Importantly, the process itself appears to be empowering: people who embarked on this path changed their relationship to their health situation even before making discoveries that helped lead to symptom improvement.
The process such people describe typically involves cycles of hypothesis generation, modification of inputs, introspection, and evaluation. Keeping track of experiences and context that are potentially important is a significant challenge. We believe that better tools for collecting and exploring relevant data and observations will help empower a broader set of people to embrace an “investigator” role in their own lives.
The core of the BodyTrack system is an open source web service which allows users to aggregate, visualize, and analyze data from a myriad of sources. Examples of data sources include physiological metrics from wearable sensors, image and self-observation capture from smart phones, local environmental measures such as bedroom light levels and in-house air quality monitoring, and regional environmental measures such as pollen/mold counts and air particulates. Examples of physiological data sources are the Zeo sleep monitor, activity monitors from BodyMedia and Fitbit, Withings’ WiFi scale, and Dexcom’s continuous blood glucose monitor. Photos can be used to record intake of food and medications, social contacts, and visible symptoms such as rashes or eczema. Smart phone self-observation apps such as Mymee or Tonic can be used to record relevant subjective experiences, such as headaches, pain, energy levels, or mood.
With BodyTrack you can explore relationships among these various types of data on a common timeline, fluidly zooming to different time scales from years to microseconds. The system gracefully handles and co-visualizes data with heterogeneous sampling rates, from observations of infrequent symptoms through densely sampled continuous tracking data with billions of sample points. The BodyTrack data handling system creates and serves data tiles at varied levels of detail to allow fluid zooming to different time scales, analogous to how Google Maps or GigaPan serve spatial map or image tiles at various zoom levels. If a given data tile would contain too many samples at the requested level of detail, the data points falling in each bin are merged, and statistics such as standard deviation and count are preserved. Tiles with sufficiently few samples that they all fit contain the original data and timestamps. The viewer dynamically requests data tiles at the appropriate level of detail as the user zooms in and out, yielding a fluid experience.

Users of the BodyTrack system have explored a variety of topics. The following are some examples of users for whom the process of self-tracking and reflection led to insights and customized strategies for improving aspects of life that had been troubling them:
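The tiling scheme described above can be sketched roughly as follows. This is an illustrative Python sketch, not the actual BodyTrack server code: the power-of-two tile width, the bin count, and the raw-sample threshold are assumptions chosen for clarity.

```python
import statistics

MAX_SAMPLES_PER_TILE = 512  # hypothetical threshold for serving raw samples
BINS_PER_TILE = 256         # hypothetical number of bins per merged tile

def tile_bounds(level, offset):
    """A tile at `level` spans 2**level seconds, starting at offset * 2**level."""
    width = 2.0 ** level
    start = offset * width
    return start, start + width

def build_tile(samples, level, offset):
    """Build the tile at (level, offset) from `samples`, a time-sorted
    list of (unix_time, value) pairs.

    If few enough points fall in the tile, return them verbatim with
    their original timestamps; otherwise merge the points in each bin,
    preserving count, mean, and standard deviation."""
    start, end = tile_bounds(level, offset)
    in_tile = [(t, v) for t, v in samples if start <= t < end]
    if len(in_tile) <= MAX_SAMPLES_PER_TILE:
        return {"level": level, "offset": offset, "raw": in_tile}

    bin_width = (end - start) / BINS_PER_TILE
    bins = {}
    for t, v in in_tile:
        bins.setdefault(int((t - start) // bin_width), []).append(v)

    merged = []
    for i in sorted(bins):
        vals = bins[i]
        merged.append({
            "time": start + (i + 0.5) * bin_width,  # bin midpoint
            "count": len(vals),
            "mean": statistics.fmean(vals),
            "stddev": statistics.pstdev(vals),
        })
    return {"level": level, "offset": offset, "binned": merged}
```

Serving merged tiles keeps each response bounded regardless of the underlying sample density, and preserving count and standard deviation per bin lets a viewer render error bands at coarse zoom levels instead of silently discarding detail.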
During this session we will share further details of these technologies, techniques, and experiences, show a variety of data acquisition apps and devices, demonstrate our data exploration tools, provide pointers on how to get involved with self-tracking as a user, developer, and/or participant in the Quantified Self community, and answer attendees' questions.
Anne Wright is Co-principal Investigator and Director of Operations for the BodyTrack project in the CREATE Lab at Carnegie Mellon University in Pittsburgh, PA. She received B.S. and M.Eng. degrees in computer science and electrical engineering from the Massachusetts Institute of Technology in 1996. After leaving MIT, she co-founded Newton Research Labs, a successful robotics and computer vision company, then joined the Intelligent Robotics Group at NASA Ames Research Center where she served as Lead Systems Engineer for Prototype Mars Rovers. While at Ames, Anne became interested in how to harness sensing and data visualization technologies and techniques originally developed for the rovers to help people “debug” diffuse environmentally related conditions such as allergies, food sensitivities, asthma and migraine triggers, etc. She moved to Pittsburgh in 2009 and spent a year studying biochemistry at CMU. She co-founded the BodyTrack Project in 2010 with the support of the Heinz Endowments of Pittsburgh. Through the BodyTrack Project she pursues a multi-faceted approach to improving health empowerment for people affected by such diffuse conditions, including open-source technology development, aggregation and visualization of data from existing devices and data sources, collaborative development of common data interchange formats and APIs, development of custom devices, and cultural engineering. She also seeks to identify and catalyze synergistic efforts in this space such as the Quantified Self, Quant Friendly Doctor, Locker Project, and open mHealth movements.
Founder of Fluxtream.com
Rich Gibson works on the Gigapan and Explorable Microscopy Projects for Carnegie Mellon University and NASA's Intelligent Robotics Group, and independently creates high resolution portraits of people and develops new ways to archive physical spaces with explorable images.
He is a bricoleur hacker, artist, programmer, author, and builder.
He helped create the Neogeography movement, coauthoring Mapping Hacks and Google Maps Hacks. The process of exploring how we interact with space led him to the more general world of providing both context and detail with explorable images.
For the past four years he has been obsessed with creating new ways to capture and use high resolution images of everything, including the Chaos Communications Camp in 2007, volcanoes in Arizona for NASA, the incredibly tidy offices of Monochrom in Vienna, details of cell mitosis in mouse testis at sub-micron resolution, and portraits of people with the detail of landscapes.