13 Jun 12

Responder Maps: Week 3

Accomplished

Efforts to understand stakeholder/user goals and needs continued in Week 3. Documents reviewed included user personas developed for MapMixers.org (an earlier incarnation of Responder Maps). These personas (and related scenarios) described the backgrounds, goals, and needs of information specialists working with first responders to improve situation awareness, pre- and post-incident planning and analysis, and reporting.

This information helped frame discussions I joined at the DMI plenary (hosted by CMU), where local software entrepreneur and firefighter Cal Blake introduced a small but diverse group to FieldApps.net. The plenary, attended by active firefighters, community leaders, technologists, and emergency response officials, touched on various issues relating to Common Operating Picture services, including Responder Maps. The app Cal demonstrated displayed features including standpipes, water mains, power lines, and boundaries, all mapped to the topography of a specific response area (Stanford's campus). He also described functionality supporting data annotations and contributions from users. (Later, Cal and another Palo Alto firefighter described to me the binder maps stations keep to record details specific to oft-visited structures.)

Cal acknowledged issues relating to data provenance and validation, providing a useful entry point to discuss Responder Maps. As currently constituted, Responder Maps is positioned as an exchange for map layer data, acting as a "translator" that normalizes and integrates data from numerous sources. Once established, the data layers catalogued/hosted by Responder Maps will be available to power a variety of map viewers and applications (in addition to the map layer mashup viewer hosted at ResponderMaps.org). (This ecosystem is analogous to another area I've explored a bit: data as a service, or DaaS.)
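
To make the "translator" idea concrete, here is a minimal Python sketch of what normalizing layer records from two providers into a common schema might look like. The provider names, field names, and schema below are illustrative assumptions on my part, not actual Responder Maps code.

```python
# Hypothetical sketch of the "translator" role: map provider-specific
# records onto one common layer schema so any viewer can consume them.
def normalize_layer_record(record, provider):
    """Normalize a provider-specific record to a common layer schema."""
    if provider == "city_gis":       # assumed shape: {"lat", "lon", "asset"}
        return {
            "type": record["asset"],              # e.g., "standpipe"
            "coordinates": (record["lat"], record["lon"]),
            "source": provider,
        }
    if provider == "utility_feed":   # assumed shape: {"y", "x", "kind"}
        return {
            "type": record["kind"],
            "coordinates": (record["y"], record["x"]),
            "source": provider,
        }
    raise ValueError("unknown provider: %s" % provider)

# Once normalized, records from any source can power the same map viewers.
layers = [
    normalize_layer_record(
        {"lat": 37.43, "lon": -122.17, "asset": "standpipe"}, "city_gis"),
    normalize_layer_record(
        {"y": 37.44, "x": -122.16, "kind": "power line"}, "utility_feed"),
]
print(layers)
```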

Together, these topics, along with matters relating to data schemas and mapping standards, yielded useful lines of inquiry to raise as we engage outside organizations to explore data partnerships.

Challenges

As Responder Maps evolves into an enabling service/platform, requirements gathering specific to data provider partners has come into focus (a need exposed by the growth of numerous map viewers built by organizations and unaffiliated developers and by the proliferation of location-based datasets). Exploring and exposing valuable datasets requires outreach informed by and focused on the goals and needs of end users, e.g., first responders. To facilitate this agenda, I've produced a short video treatment demonstrating the use of Responder Maps by firefighters in the field. Once we validate the script with professional contacts and produce the video, we plan to share it with outside parties to spark discussion of the Responder Maps vision and of the need for sample data (as well as related functional and business requirements) to iterate on and improve the service.

Goals

Finish production of the short demo video (with end user input) to communicate the vision and engage partners in requirements gathering, product design, and development. For additional context, note that I have separately posted another blog entry containing artifacts and deliverables sequenced by delivery date. The underlying Google Doc will be updated weekly.

05 Jun 12

Responder Maps: Weeks 1-2

Introduction

During weeks 1-2, I focused the bulk of my efforts on gathering resources to research user-centered application development and design. Initial meetings with Mike Prince/Citizen 911 were promising but quickly led me to conclude that high up-front investments in contextual inquiry and requirements gathering would be necessary to define and validate a viable product, leaving little time to address usability and design issues.

Accomplished

Further exploration ultimately led me to the Responder Maps project, DMI's recently revived effort to integrate data with mapping services to display a Common Operating Picture crisis map. As currently constituted, the project has researched, built, and/or logged many of the major features (functional requirements) for future iterations. At this stage, more effort is needed to engage and attract stakeholders (including agencies like BART) to explore their unique datasets and any specific functional and nonfunctional business requirements that may accompany them.

Challenges and Plans to Address

Key to this process is the ability to "sell" stakeholders/data providers on the benefits of sharing their data and on the overall experience of using the RM tools. To accomplish this, we will create collateral we can share with stakeholders remotely or in person, e.g., a demo video introducing the product. These materials will need to meet the quality and fidelity expected by sophisticated users.

Thereafter, we plan to put artifacts such as prototypes and a usability study into play to validate the product and identify needed improvements. Recruiting participants and identifying features to test and share will require ongoing effort and investigation.

Goals

To produce the collateral, Trey has connected me with one of his design interns. This week, I will draft a video treatment for review that we can put into production as early as next week. The second challenge will require more ongoing effort, e.g., attending DMI meetings and arranging follow-ups with specific stakeholders to recruit participants, working with additional team members to develop prototypes, reviewing product requirements and background materials, etc. More on these items later.

16 Jul 11

Fusing Film Facts with Google Maps

Some of the most information-rich data mashups (including ProPublica's Tools & Data blog, the BBC's Datablog, Trulia's crime map, and the Bay Citizen's bike accident app) show off robust mapping tools well within the grasp of hacks and hackers alike. The visualization above, powered by datasf.org and Google's Fusion Tables, shows just how easy it is to mash up metadata and locations using Google's geocoding engine.

Among the rich troves available from DataSF, a record capturing shot locations and movie minutiae gathered by the SF Film Commission provides an intriguing gateway to the city's cinematic past. Before deeper analysis, a few threshold data cleansing issues should be addressed:

  • A simple CONCATENATE operator to append "San Francisco, CA" to the sparse address information is all that's needed for Google to situate each record (see the Python sketch after this list for a scripted alternative).
  • Next, upload the full table (including movie titles, actors, and facts) and plot it on a Google Map using Fusion Tables. (The actual steps, import and visualize, are pretty self-explanatory. Start off here: http://www.google.com/fusiontables/Home.) This dataset imports fairly cleanly; for less structured corpora, Google Refine is indispensable.
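
If you prefer scripting the cleanup to spreadsheet formulas, here is a minimal Python sketch of the same CONCATENATE step. The file and column names ("film_locations.csv", "Locations") are assumptions; check them against the headers in the actual DataSF export.

```python
import csv

# Sketch of the CONCATENATE step in Python rather than a spreadsheet
# formula. File and column names are assumptions; verify them against
# the actual DataSF export before running.
with open("film_locations.csv", newline="") as src, \
     open("film_locations_geocodable.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Append city/state so Google's geocoder can situate each record.
        if row.get("Locations"):
            row["Locations"] += ", San Francisco, CA"
        writer.writerow(row)
```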

The real fun comes with the numerous display and filtering options available. (See http://www.google.com/fusiontables/DataSource?dsrcid=1151606.) After visualizing the full dataset, go to the View > Filter options and limit the display to specific actors, a single movie, or a release year date range. Integrate the results with data from additional tables, say a MUNI bus route map, and roll your own Star Map for publication on your website. The chart view below provides a handy summarize or "count()" function for histograms.
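
The same filters and counts can also be reproduced offline on a CSV export of the table; here's a quick Python sketch of that approach. The column names ("Actor 1", "Release Year") and the example filter values are assumptions to illustrate the idea, so match them to the dataset's actual headers.

```python
import csv
from collections import Counter

# Reproduce the View > Filter step and the chart view's count() on a
# CSV export. Column names and example filter values are assumptions.
with open("film_locations.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Filter: one actor plus a release-year date range (the 1970s).
# Lexicographic comparison works here because years are four digits.
subset = [r for r in rows
          if r.get("Actor 1") == "Clint Eastwood"
          and "1970" <= r.get("Release Year", "") <= "1979"]

# Histogram: count records per release year, like the chart view.
by_year = Counter(r["Release Year"] for r in subset)
for year, n in sorted(by_year.items()):
    print(year, n)
```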

Thanks to the Bay Citizen (specifically, @tinio for the Fusion Tables demo), SF Hacks/Hackers, and DataSF for the weekend of #datasf presentations and knowledge sharing.