Trip Report: USGS CDI Citizen Science workshop, day 1

9/11/12 USGS Community Data Integration citizen science workshop

Ice Core Lab, Federal Center, Denver, CO



Randy Updike – Rocky Mountain Region

He is a hazard scientist, and citizen participation has been critical in his work. Led a NAS team to a volcano in S. America where 25K people had been killed; sheer chaos, and most infrastructure was destroyed. Relied upon the locals who had survived to provide local support and tell them what to go see. The citizenry were the most valuable resource – they had observed the event and knew which features of it were unusual and remarkable, so they led the team to places where there were interesting things to see. Great trip, not due to the scientists, but rather because of the involvement of the local people.

Has needed to engage the public in this work, and in the past the technology wasn’t there to make it happen, but today it is. Excited to learn how technologies can support the science.


Linda Gunderson via WebEx – Director, Office of Science Quality and Integrity

As crowdsourcing and citizen science become more accepted practices, this is really changing the way people do work in exciting ways. In last 2 months, got a call from the White House and learned that Teachers Without Borders were interested in working with USGS on hazards project focused on introducing women and girls to science. It was a phenomenal event. Question from TWB – what would you do if you could harness the passion and attention of 100K women and girls across the globe? What an amazing idea! What they proposed to the White House is to have 100K women and girls, teach them a science curriculum, then have them do vulnerability assessments of their community to incorporate into the global earthquake model.

What would you do with the millions of students and everyday citizens who are interested in our world across the US? How could they contribute to the USGS mission? Need to be careful about data quality and rules, but don’t let those things bog you down in the face of the big questions.

Citizen science is a named objective in the coming strategy for DOI. They are interested and want it to happen. Part of the reason is because it’s an ingenious way to harness that power, but it also develops scientific literacy, inviting students into a career in science, which is a desperate need for long-term growth, innovation, and the economy. A competition for development of a citizen science app, working with citizen science groups, has money behind it and will be announced soon.


Jennifer Shirk, Cornell Lab of Ornithology

Exploring the Landscape of Citizen Science and PPSR

Exciting that USGS is exploring its identity and role in this field of practice. Given the growing landscape of citizen science, how might organizations work together to map out and manage the terrain? We’re exploring this landscape together.

Citizen science is part of the mission statement for CLO; she manages Citizen Science Central, extending involvement in citizen science far beyond birds. Goal of the website is sharing best practices and ideas among the community; we need to learn from practitioners involved in the diverse kinds of work, like citizen science, where the public participates in scientific research. With the growth of citizen science, there’s need for more support for the field.

Sharing brief stories to illustrate the possibilities of citizen science. Terry Root, who’s used data from the Breeding Bird Survey, owes her career to citizen science, documenting bird distribution trends, which wouldn’t be possible without distributed observers. Close to 500 papers have been published with BBS data. She’s also interested in influencing public literacy; how do you do this when you’re working remotely with people you’ll never meet? Terry says she does that by encouraging her students not to hesitate to use citizen science data, with the caveat that you have to ground truth them and determine what data are usable.

Julia Parrish is at UW, where she started COASST to explore the status of seabird populations throughout the PNW by looking at dead birds on the beach. Designed a rigorous protocol for identifying the birds, plus verification with photos and other data. Has been able to understand, publish, and influence management around beached birds, watching for unnatural mortality events. Initially approached Audubon to recruit volunteers, and found birders weren’t a good audience. Needed people who were more interested in the beaches and health of beach ecosystems. She says citizen science is not just about science, but about people doing science about something they love. Learned that you can’t get people to monitor the most “important” beach, but need to let them connect to what matters to them, find and celebrate their connection to the place/organism, and that improves retention.

Grupo Tortuguero de las Californias – story of Wallace J Nichols, who started an EarthWatch project in Baja to look at populations of the highly endangered black sea turtle. Started those expeditions as a PhD student because he couldn’t fund the work due to skepticism about even finding these turtles. Partnered not only with EarthWatch but also fishermen and turtle hunters, people whose culture and identity are really linked to the sea turtles. Growing a network throughout Baja and beyond of people supporting research and management that has made a big difference in the recovery of this endangered animal. Who needs this data? Beyond scientists, the locals need and appreciate it. Wrapping up the stories with practical advice: when starting these projects, don’t be daunted – he basically sold everyone he knew a T-shirt to initially fund the project. Highlight for her at the PPSR conference was these people meeting one another.

Reflect on where USGS is in this landscape – by holding this workshop and asking questions, USGS is pretty centrally located and can help us come to a better understanding of this terrain in PPSR. Going back to the analogy of landscape, how can we collectively map out and navigate the terrain? Lots of work needed just to understand this: what other projects are out there? Who is in my area or organization that I can learn from? What’s already known about how to do it well? Where do I find ideas, resources, papers? What contribution can my project make? These are questions we’re hoping to answer in part through a project database launching this fall, a centralized resource for project leaders to curate info about projects & pubs in one place.

Lots of procedural things to consider for getting from inputs to outputs. The power of citizen science is definitely in the bulk of observations, and also in experiences – not just deeply meaningful momentary experiences, but experiences that let people come to know something scientifically. Your work in citizen science will ultimately influence both, but it matters how you achieve these things. Just like good science & good education, science literacy needs to be approached thoughtfully, intentionally, and deliberately.


Citizen science policies and challenges session

Introduced by Annie Simpson

Goal is clarifying USGS & govt policies and providing a platform for citizen science researchers so that they can share instances where policy hasn’t worked well for them – opportunities for lessons learned. Want to document how to work better within policy limitations and structures.


Lorna Schmid, USGS Infrastructure & Operations Team Lead

Policy Issues with Citizen Science App Development

Why do we need a mobile framework? Addressing multiple challenges and establishing support tools and processes – all of the reasons we all know: unknown or limited resources, limited support for collaboration, funding, data sharing. Goal of the project is to develop a one-stop shop that enables workflow processes, code repositories, a mobile community, and common training.

Workshop outcomes on mobile: understanding the mobile ecosystem – hardware, app development & delivery. Initial focus on the app development life cycle. The process of mobile product development has 4 phases, starting w/ ideation and moving through to delivery. The ideation process includes steps for peer review to ensure an app is not duplicative, has all the needed parts, and will reflect well on USGS. Fairly technical process workflow.

Development workflow: check policy issues, develop a communications and marketing plan, mobile community peer review, science review, plus a security checklist and a specific checklist for the development document. Then review/approval for external products (e.g. created by a partner?) involves user testing, security approval, policy review, cost center approval. Once the app is published, need to monitor its performance – if it has low ratings in Apple’s App Store, need to review the product, since it reflects on the organization. Scope changes are handled by size – minor fixes like typos can be patched, but major changes go back through the development process.

Recap: training a mobile community; need this to vet applications. Recognize audience dependence and the need for a phased workflow with documentation. Next steps are a FY13 proposal to continue developing the framework, build out the mobile community site, host town halls, continue working with the team, and advocate for a stronger community.


Rob Thieler, USGS Research Scientist w/ Coastal Marine Program

Researcher Point of View: iPlover: A Smartphone Application to Characterize Piping Plover Nest Locations

Found that research scientists collected data in field notebooks but most had smartphones – why not collect right on the phone? Learned several lessons from this.

Business case: sea level is rising, and this is a very big deal. What to do about it? Natural and cultural resources are at risk – produced a preliminary map of vulnerable locations. Need to inform decisions about climate change w/ an uncertainty management framework; scientific priorities and practices need to change to make that happen.

Looked for poster child, picked Piping Plovers to work with USFWS and NPS, this is his cute charismatic species slide. Piping plovers prefer hazardous beaches for nesting, often subject to storms and other issues. The plovers are hard to see on the beach, nests are even harder to see but are out in the open. Got involved to support decision-making for DOI agencies; listed species, DOI mgt responsibility, interesting and specific habitat requirements that can be predicted, can then feed that back into population dynamics models. Their approach is looking at cascade of info through Bayesian Networks.

Geologist’s view of plovers’ thinking, trying to figure out how to gather data (give them a widget, not a widgeon!) Deploying trained observers and providing protocols, vastly increasing the spatial domain, dealing with fuzzy observations, and already knowing what to collect – screens of app interfaces.

Lessons learned: HTML5 + JavaScript works best. Most devices have issues – wide variations in baseband chips, browser support uneven. Just because someone has a smartphone doesn’t mean they know how to use it! But valuable data can be collected. More lessons: little guidance on mobile apps – Privacy, PII, iconography, branding, OpenID (authentication failure for offline data collection), OMB approval.

24+ organizations monitoring plovers; USGS needs a strategic plan for mobile apps for many reasons. Project needs data to drive research & facilitate decision support. Got a smartphone app working but hurdles included lack of guidance and policy impediments. Demonstrated great opportunities and challenges.


Cheryl Smith, USGS Volunteer Program Coordinator (WebEx)

USGS Volunteer Program and Handbook

Views here are from official USGS volunteer management guidelines. HR oversees volunteers, maintains the handbook, is the contact for DOI, and gives employees access to the volunteer website to post opportunities. Also fields calls and answers questions about volunteer programs.

Care and feeding of volunteers is done by the science centers who work w/ volunteers – the fun part. Volunteers have to track time spent and turn in hours for evaluation. The Dept is encouraging bureaus to develop more volunteer opportunities due to budget constraints and the major contributions of volunteers to USGS. Volunteers have to pass a fingerprint check and be US citizens or have a work visa. That just gives access to the building. Unless they need IT or special access to buildings, that’s the only security requirement for working less than 180 days.

Need to sign a special agreement to be a volunteer; no minimum age requirement, but minors need an emergency medical care release. A hazard review must be conducted if minors will be involved. No underwater diving or use of firearms.

The Volunteer for Science Handbook includes sections on health and safety related to hazardous conditions, e.g. in labs or on boats/watercraft or unscheduled airplanes – these make the gov’t liable for claims against USGS. That’s the main reason to have volunteers properly signed up w/ written agreements, a detailed description of their volunteer job, etc. Also includes ethics rules; volunteers have to abide by USGS ethics, e.g. not sharing sensitive data. Constraints are in place for all volunteers; scientists emeriti are also part of this scene – they become volunteers subject to these constraints, even as former employees.

Post volunteer opportunities on


Eric Wolf and Barbara Poore, USGS Center for Coastal and Watershed Studies

Researcher Point of View: How are USGS Citizen Projects Impacted by Government Policies?

Current policy issues – Privacy Act (1974), Paperwork Reduction Act (1995), Data Quality Act (2002), Fundamental Science Practices, USGS Volunteer Handbook (2011). The Open Government Directive in 2009 is a bit less restrictive than the USGS Vol Handbook.

Paperwork Reduction Act – supposed to reduce duplicative work (part of the point of some citizen science). Volunteer policy about fingerprinting and long forms really imposes a burden on volunteers. Activities addressed in the handbook get a lot of guidance. Doesn’t cover the lightweight stuff – Acadia Learning, MLMP – do teachers stand in for parents with student groups (in loco parentis)?

Citizen science is global, but people who aren’t citizens or legal aliens can’t be involved. No one outside of US can participate, and there are no guidelines for how to work with these folks. Few guidelines on mobile apps, as we saw. Also mining data from social media that are publicly shared. Concerned about potential for ethically questionable activities in data mining, even if following the law – does government have different responsibilities than corporations?

Shape of citizen science today: who asks questions? Experts. Geography of questions – local, primarily due to policies; participants also local. Knowledge/skills require being fairly educated. Consent must be documented. Research design done by experts. Geography of technology is stationary. Usability is complex (i.e. sucks). Data use is pretty much only for experts, are people collecting the data able to get the data? Basic ethos is that data are shared and are public domain.

In the future, this balance could change. Citizens could ask questions, geography of questions and participants could be global, no knowledge/skills required; unconscious consent; citizens involved in research design; portable technologies; simple usability – no one complains about something being too easy to use; community data use, let everyone explore the data.


Shari Baloch and David Newman – USGS Office of Enterprise Information (WebEx)

Lightning Talk: Freedom of Information Act – Employee Responsibilities GIP 140

3 laws impacting science activities – PRA (Paperwork Reduction Act), Privacy Act, FOIA

Paperwork Reduction Act: requires OMB approval for collecting structured info from 10+ members of the public annually – e.g. a form, web page survey, focus group, phone survey, mobile apps. The public is basically anyone not affiliated with a federal agency. The point is minimizing burden on the public and improving the quality and use of federal info. The process takes 6-9 months with 2 public comment periods, so contact them early. If you know of existing data collection without OMB approval, contact them to figure out what the options are for getting approval.

FOIA – most records are subject to it, and USGS promotes transparency – so don’t badmouth anyone in your documents because they could be requested. Some records might be withheld through exemptions. Employee responsibilities: request a FOIA rep to coordinate the response because of the 20-day time limit; missing it is subject to litigation, and that’s happened before. Also need to follow the steps for FOIA requests (GIP 140).

Privacy Act – responsibility for Personally Identifiable Info (PII). Need to deal w/ a Privacy Act system of records notice (SORN), so anything that uses a unique identifier needs approval from Congress and the DOI Secretary. Suspected breaches need to be reported w/in an hour!


Paul Earle, USGS Hazards, for Sophia Liu

Researcher Point of View: USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

Intro slides w/ XKCD comic! Geospatial crowdsourcing – typology includes types of crowds: survivors, diaspora, social net, expert, col; types of sourcing: tagging, mapping, curating, feeding, harvesting.

TED started in 2009 after a major earthquake that killed 60-70K people. USGS was criticized – people knew about it on Twitter before USGS did, and it got some press. Any seismologist knows quickly that an earthquake happened, but can’t estimate its characteristics fast enough, and by then it’s already well known by the public. The question was how to make that info work for USGS – couldn’t even access Twitter from a government computer!

Worked w/ the USGS social media group to figure out how to get access to Twitter and do some initial test cases. Found potential ways to use it – not just detection, like bloggers said – could also use it to get a rough location, because most tweets have location metadata: GPS lat-long for about 2%, plus city names, and getting down to the city level is pretty good for earthquakes. Tweets travel at the speed of light while seismic waves travel at the speed of sound, so Twitter can be faster than seismic stations, which are pretty sparse globally. E.g., in China, the seismographic data is delayed by 15 minutes before it’s sent to USGS, so they had no info about that huge earthquake in 2009.
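The detection idea described here – a sudden jump in “earthquake” tweets above the background rate – can be sketched as a short-term/long-term average ratio over per-interval tweet counts. This is a toy illustration, not the actual TED algorithm; the window sizes and threshold are made-up assumptions.

```python
def detect_spike(tweet_counts, short_win=2, long_win=20, ratio_threshold=5.0):
    """Toy STA/LTA-style detector over per-interval counts of keyword tweets.

    Flags interval i when the short-term average (last `short_win` intervals)
    exceeds `ratio_threshold` times the long-term background average
    (last `long_win` intervals). Hypothetical sketch, not USGS's code.
    """
    alerts = []
    for i in range(long_win, len(tweet_counts)):
        lta = sum(tweet_counts[i - long_win:i]) / long_win   # background rate
        sta = sum(tweet_counts[i - short_win:i]) / short_win  # recent rate
        if lta > 0 and sta / lta >= ratio_threshold:
            alerts.append(i)
    return alerts
```

With a flat background of one tweet per interval, a burst of 20-30 tweets per interval trips the detector within a couple of intervals.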

Got framework set up by a computer science student, who then got a job – so they got another student, but there’s a lot of variability in student motivation and skills. After that person, got a little more money and got some professionals in on it to build a robust system to gather tweets and send out public messages about seismically verified events. That framework provided ability to build other applications off of it. Product has been rock solid, big advantage of using professional developers. One of few systems that works better in real time than on static data!

Performance: has had detections in as little as 18 seconds! Average of 3 events per day, 95% correlated with true seismic activity. Faster than $200M seismic tracking system. Alert delay is 26 – 237 seconds. Magnitude of detected events from 2.5 – 7.4 on Richter Scale. Even though it requires jumping through hoops, it really does what it’s supposed to do, there’s no way to do this as an individual researcher.


Kevin Gallagher – Associate Director, Core Science Systems

Lots of discussion about engagement driven by new technology, but citizen science has been going on very successfully for a very long time, particularly in conservation efforts. So what’s really new here? The opportunity has been here for a long time and makes the mission more relevant. Competing at the national level for funding with every other agency. The opportunity to engage w/ citizens lets USGS showcase assets and work, and improve mission relevancy in the public eye.

What’s really new is the tech that lets us engage in new ways, e.g. mobile devices. Need diverse talents to realize the potential, how to simplify and move forward. Policy is a big hurdle, so a subcommittee that can work on those issues to help support smoother, faster approvals would be valuable. Building out mobile computing infrastructure also needed, and sharing stories too.

Some discussion of attendees’ goals in being at the workshop – appreciation from “outsiders” for being present and being able to engage, appreciation from USGS for having non-USGS people involved. Need to start getting the house in order; there’s no question that as things move forward, there’s need to work with partners and steal shamelessly from what other agencies are doing. Committed to making this successful, creating forums, investing in staff support, assist in moving policy, put funding on the line to make it happen.


Matt Cannister, USGS Biological Technician (Southeastern Ecological Science Center)

Researcher Point of View – An Overview of the USGS Nonindigenous Aquatic Species (NAS) Online Sighting Report System and Lessons Learned from Mobile App Development

75K+ records for 1K+ nonnative freshwater (mostly) species, each record with up to 60+ data fields and public can see 20-30 data fields. Also show point and HUC (watershed?) distribution maps, e.g., for Lionfish. Also does species accounts with fact sheets for public reference. Now have an alert system for Early Detection/Rapid Response aspect – big aspect of invasion biology, get rid of the problem before it’s a problem. Alerts generated when species observed in novel area, goes out through RSS feed and tweeted, also email alerts according to state, group of species, specific species, etc. Have issued almost 1200 alerts, 62% from personal communications through Online Sighting Report form, 83% of confirmed Lionfish are from public reporting. People can fill in various fields, upload images for verification, and lots of disclaimers.
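The alert-triggering logic described above – an alert fires when a species is reported in an area (e.g., a HUC) where it has no prior record – could be sketched like this. Field names and structure are hypothetical illustrations, not the actual NAS schema.

```python
def novel_sightings(records, known):
    """Return sightings of a species in a watershed (HUC) with no prior record.

    `records`: iterable of dicts with "species" and "huc" keys (illustrative).
    `known`: set of (species, huc) pairs already in the database; updated
    in place so repeat reports from the same new area alert only once.
    """
    alerts = []
    for rec in records:
        key = (rec["species"], rec["huc"])
        if key not in known:
            known.add(key)   # record the range expansion
            alerts.append(rec)  # this sighting would trigger an alert
    return alerts
```

In the real system the alert would then fan out via RSS, Twitter, and the state/species-filtered email subscriptions mentioned above.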

Want to turn that into a mobile app. One constraint with online reporting is the follow-up after field time to enter data, with the potential of losing details. Want to make an app like iPlover but for aquatic invasives. They have the app – sort of! A working prototype emerged in 3 months, fully functional with back-end server support. What makes it only a “sort of”: there are no technological issues and no financial issues – the problems are all bureaucratic. Have been waiting since November 2011 for DOI to approve the Management/Review/Approval process for app development before the app can be sent out to app stores.


Policy discussion

TOS for Apple and Google app stores are not acceptable for USGS, though some projects have achieved approval one-off, e.g. NPN had one. NPN collected its millionth data point and did an emergency data collection approval with OMB – had to jump through many hoops, got scolded, has a 6-month approval, and now has to go through the rest of those steps. Just getting sign-off on TOS doesn’t mean the rest of the stuff is ready to go.

There is now a fast-track for OMB that is actually working, takes a lot less time to use a web form – as little as three weeks for questionnaire approval compared to 2 years. Hard to speak to all the parts of the process, maybe need a checklist of how to go through all of these steps.

Need to avoid hiding USGS work in universities, etc. Already fighting for funding, need credit for this work. Things like mobile app dev framework will help navigate some of those other issues. Having people use universities as an intermediary dilutes credit to USGS. Need for partnerships that meet all players’ needs. Concerns about university partnerships being less stable than USGS needs, e.g., due to funding cycles.

Nothing stopping USGS from developing tools (subject to review process) and releasing as open source, e.g. technologies and protocols which are huge amount of effort to develop. That makes much bigger potential impact.

Need vision statement and direction for moving this forward at USGS level. How much would a general strategic plan help? It would make a difference to establish big questions to answer, how to work together, the kinds of communities to form, and issues that need to be addressed, e.g. policy. It could be a very simple framework, but that always helps.

When were these showstopper policies put in place? Are they outdated? Doing the workarounds could show the value of updating those policies, because that’s the really big roadblock.

What have been major issues with PII in citizen science at USGS? Wouldn’t say mistakes are made – requirements are generally understood – but organizers are sometimes doing it their own way rather than going through the official processes. Some misunderstanding of what PII is: just a name isn’t PII, but two different pieces of related, linkable information together are. Lacking a unique identifier makes the data problematic for research use because of data quality concerns.

Policies so complex (and numerous) that there’s room for multiple interpretations.

How can USGS scientists figure out policy and resources to recruit effectively within policy guidelines? Hard to figure out what the full process is, right now it’s word of mouth, don’t know where to go for that info. There’s a new website to try to help document all that info, building a one-stop-shop.

A memo from OSTP covers things like 3rd-party ID systems, but virtually no one seems to know about that memo. Also don’t know when there are updates to policy – need a way of keeping people up to date on changes. Collecting the policy knowledge in the room would help generate a pretty comprehensive list.

Many of these issues are burden to domain researchers, they shouldn’t have to deal with all the policy and other related stuff outside of their areas of expertise and be pulled out of their research. Same issue in data management. Need an ombudsperson or liaison to help researchers navigate these issues.

Sally Holl discovered Code for America and suggests creating a Code for Science proposal. The science sector lags behind other areas; a lot of government data is not available, and what is available is not all that usable. Need to be able to host hackathons with staff providing bridges between domain science and IT development. Doing this would require hiring a group rather than individuals who all need to be fingerprinted, etc. Could also run a contest – USGS has never done this before; Abby is working on putting a proposal for a challenge together.


Engaging the Public in Scientific Research Session


My talk – The Evolving Landscape of Citizen Science


Barb Horn – State of Colorado (WebEx) NWQMC

Engaging the Public in Scientific Research Perspectives from the Volunteer Monitoring Community

Volunteers as workforce – a different model and set of resources to support; involving volunteers makes it more than just research. The volunteer monitoring community is sort of its own community, especially in water quality monitoring. Trying to explain to the National Water Quality Monitoring Council that volunteer monitoring projects do pretty much the same thing as professionals. Data management and technologies are an issue, as for everyone else.

Started out as primarily education and stewardship, but has moved toward decision-making and taking action. Planning phase steps really don’t show how to deal with the people part – figuring out who the decision-makers are, what info they need, and that’s what should drive the rest of the planning steps.

Planning process starts with what exists, formulating objectives, how to manage and analyze, how to deliver to decision-makers, evaluation of project success, what needs documentation, and what’s the appropriate network to use those assets. General approaches are tiered, based on the amount of rigor and time involved. The advantage of the simple starting point is that you can potentially move people up into a higher tier, but the disadvantages have to do with data interoperability and not having the right level of data for a particular location. The other approach is one-size-fits-all: all the resources go into one product/service/delivery/support model. It supports interoperability, but costs more up front, may lose people who want to do the more rigorous work, and might not meet all the objectives.

River Watch is a one-size-fits-all. Annually they monitor around 700 stations a year, including field, lab analysis, physical habitat, macroinvertebrates, photo. Everyone does same training and if they can’t meet QA/QC cut-off points, they can’t continue to participate. Equipment issues – loaning, giving equipment w/ either the group keeping the equip or they reclaim it, etc. Standardization is required across methodology for this kind of data.

Analysis approaches: some do it all, others collect and hand over to experts, some do a combination of those approaches, but in all cases it has to be standardized. Study design may be very different for different participant groups, and sometimes a group starts with one study design and then changes the design to match the outcomes, then maybe does an intervention and later goes back and re-checks. So monitoring can evolve and study designs change. Survey on demographics and motivations: the main purposes are passion and making a difference. Should document volunteer responsibility, what sponsors support, and responsibilities – typical volunteer coordination considerations.


Greg Newman, Colorado State University

Cyberinfrastructure Support for Grassroots Conservation, Citizen Science, and Community-based Monitoring

Learn from the masses, and then work with them – the art of teaching is the art of assisting discovery – these are things that make citizen science work. Keys to success: volunteers, scientists, & mavens who make the projects happen; need all three roles to make a project work.

Today we have countless projects, creating volumes of data, and they often focus on pre-defined topics – birds, phenology, etc. What if you want to measure something else? With that question in mind, felt there was a need for a platform to help new projects get going. Got NSF money and built such a system, facilitate creation of projects, encourage cross-project partnerships, support data management. Platform supports both top-down and bottom-up approaches to project design. Steps in project development are far more complex than most people realize.

Simple web portal for setting up projects – personalize with photos, build their own data sheets, etc. The system allows membership gatekeeping to help support data quality; data sheets are built from typical template items, plus specific attributes and site characteristics. With users specifying attributes, they have a growing number of attributes to manage, e.g. species attributes like height, weight, etc. The goal is standardized, interoperable, and flexible data – integrating data from Pika Watch and Pika Net, all adopting the same protocols. Projects determine what to measure – so far, species attributes and site characteristics. But it could be kW energy use in houses as a new attribute in the system. Their mantra: if you want to measure it, you can measure it, but be sure no one else is already measuring it!
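The flexible-attribute idea above – projects choose what to measure from a shared, growing pool of attributes so the data stay interoperable – might be modeled roughly like this. Class and attribute names are illustrative assumptions, not CitSci.org's actual data model.

```python
class DataSheet:
    """Minimal sketch of a project-defined data sheet.

    A project picks attributes (name -> expected unit); observations are
    validated against that choice. Two projects adopting the same data
    sheet (e.g. a shared pika protocol) produce interoperable records.
    """

    def __init__(self, attributes):
        self.attributes = attributes  # e.g. {"count": "individuals"}

    def validate(self, observation):
        # observation maps attribute name -> (value, unit)
        for name, (value, unit) in observation.items():
            if name not in self.attributes:
                raise ValueError(f"unknown attribute: {name}")
            if unit != self.attributes[name]:
                raise ValueError(f"unit mismatch for {name}")
        return True

# Hypothetical shared protocol for two pika projects:
pika_sheet = DataSheet({"count": "individuals", "haypile_height": "cm"})
pika_sheet.validate({"count": (3, "individuals"), "haypile_height": (40, "cm")})
```

Adding a new measurable (say, household kW use) is just a new attribute/unit pair, which is what keeps the system flexible without sacrificing standardization.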

For data analysis, there are automatically updated canned reports, but working to build custom analysis where they can pick their own variables. In the future, hoping to develop open APIs and extend interoperability. There’s also a feedback tool for surveying volunteers. So far they’ve had 40+ projects, 7K+ observations, tech reports, academic publications, etc.

Making online citizen science successful: shared praxis, amplified collective intelligence, restructured expert attention. What motivates people? Share ownership, prompt them along the way, reward and respect them. Are rewards effective? Communicating and sharing results is number one – that’s why they’re participating. Show progress; make sure they know they’re making headway toward an answer to the research question. Give attribution – really acknowledge them. Harnessing these motivations: adopt common protocols, encourage standards while remaining flexible. How to succeed? Train the trainer – training a coordinator who trains volunteers – and crowdsourcing to distributed groups. Each model of participation has its place in the bigger picture.

So what’s next? Event management, online training support, volunteer tracking, improving project social media marketing, integrate w/ SciStarter to promote projects. Expand analysis, customized reporting, mobile apps, and blogging features.


Greg Matthews, USGS National Map Corps

Volunteer Map Data Collection – The National Map Corps

National Map Corps – volunteer geographic info used for improving topo maps, but eventually got shut down. Restarted in 2010 with OpenStreetMap (OSM) software – OSM had quite a few advantages, but they needed a willing partner, which ended up being the State of Kansas Data Access and Support Center. Took their data set and cross-walked it to the USGS best practices model, then loaded it into the system so editors could edit roads. Kansas focused on state routes, while USGS focused on interstate routes. Aside from a few hitches, it has worked well.
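The cross-walk step mentioned above amounts to renaming a source dataset's fields into a target schema before loading. A minimal sketch, with entirely illustrative field names (not the actual Kansas or USGS schemas):

```python
# Hypothetical field crosswalk: map source-dataset attribute names to a
# target (USGS-style) schema before loading records into the editor.
# Field names here are illustrative, not real schema definitions.
CROSSWALK = {
    "RD_NAME": "road_name",
    "RTE_CLASS": "route_class",
    "SURF_TYPE": "surface_type",
}

def crosswalk_record(src: dict) -> dict:
    """Rename known fields; drop fields with no target equivalent."""
    return {dst: src[key] for key, dst in CROSSWALK.items() if key in src}

rec = crosswalk_record({"RD_NAME": "K-10", "RTE_CLASS": "state", "MILEPOST": 12})
# rec -> {"road_name": "K-10", "route_class": "state"}
```

In practice such a crosswalk would also convert units and coded values, but the core idea is a declarative field mapping applied uniformly to every record.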

Guidelines for editing were also important – USGS and Kansas had the same reasons for doing the editing, which is fairly rare. Successfully deployed the system customized to USGS specs, worked with co-editing using cross-agency shared specs, and used lessons learned for phase 2. This is a large federal agency, but the project had 3 staff w/ about 1/3 of their time each, so it was a small effort in practice.

Phase 2 involved structures (buildings), volunteers from student bodies, and evaluating quality/cost to ensure there would be ongoing internal support; it was also very important to make the data available to The National Map and meet all its quality needs. Still successfully using the OSM technology stack, and trying to contribute the data back to that community as well. The editing interface was much improved using user-centered design. Contributed points include a number of attributes, collected specifically for the structures program. They seeded the project area with gazetteer info on structures, shown in red; once checked over by editors and approved or changed, points turn yellow, so you don’t get duplication of effort – the map itself shows the status of editing tasks. Also involved peer review – editors checking each others’ work.
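The red-to-yellow editing workflow above is essentially a small state machine. A hypothetical sketch of those transitions (the status and action names are illustrative, not the project's actual vocabulary):

```python
# Hypothetical status workflow for seeded structure points, mirroring the
# map colors described: seeded points show red, and turn yellow once an
# editor has approved or changed them, so effort isn't duplicated.
TRANSITIONS = {
    "seeded": {"approve": "edited", "change": "edited"},
    "edited": {"peer_review": "reviewed"},
}

STATUS_COLOR = {"seeded": "red", "edited": "yellow", "reviewed": "yellow"}

def advance(status: str, action: str) -> str:
    """Apply an editing action; reject actions invalid for the current status."""
    try:
        return TRANSITIONS[status][action]
    except KeyError:
        raise ValueError(f"action {action!r} not valid from status {status!r}")

s = advance("seeded", "approve")   # point has been checked by an editor
s = advance(s, "peer_review")      # a second editor has verified the work
```

Encoding the workflow this way makes the map's color coding a pure function of point status, which is what lets the map itself serve as the task board.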

Outreach and communication was easy using Google Sites, so it was unofficial and only available to those who were invited to participate. A forum was also set up in the hopes of reducing volunteer management load as well as collecting feedback. A basic thank-you certificate of achievement – they considered it the least they could do – shows how many of the volunteer’s structures were included on The National Map; pretty popular with some people.

Had over 1100 points contributed or edited, lots of new points added, and some historical points removed. Positional accuracy also improved – from 50% before editing to 84% after volunteer edits, and 92% after peer review, all without USGS involvement – meeting the National Map standard of 90% of points within 40 feet.
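The accuracy standard cited above can be made concrete: a dataset passes if at least 90% of its points fall within 40 feet of reference positions. A minimal sketch, with made-up error values (the actual project data are not reproduced here):

```python
# Sketch of the accuracy check implied by the National Map standard:
# pass if >= 90% of points fall within 40 feet of reference positions.
def meets_standard(errors_ft, threshold_ft=40.0, required_fraction=0.90):
    """errors_ft: per-point positional error, in feet."""
    within = sum(1 for e in errors_ft if e <= threshold_ft)
    return within / len(errors_ft) >= required_fraction

# Illustrative numbers only: 9 of these 10 points are within 40 ft.
errors = [12.0, 35.5, 8.2, 44.0, 20.1, 39.9, 15.0, 22.3, 31.0, 5.5]
meets_standard(errors)  # -> True (9/10 = 90%)
```

Under this reading, the jump from 50% to 92% within tolerance is what moved the volunteer-edited data from failing to passing the standard.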

Phase 3 – now collecting a new type of data guided by the needs of The National Map. Content management in Confluence, with directions and video tutorials; no login required, and the project is now official with approval. The editing interface is purpose-built to make it as easy as possible to contribute to the system. An Adopt-a-Quad system lets people review all the data in a quadrangle – a more technical job that asks a little more of those people – and is going well so far. Volunteer contributions since the public announcement 3 weeks ago: about 1500 data contributions, almost finishing half of Colorado in very short order. Hoping to scale up beyond CO in the next few months for these 10 structure types, potentially serving a larger set of purposes beyond USGS.


Kristi Wallace, USGS Alaska Volcano Observatory

The Alaska Volcano Observatory Citizen Network Ash Collection and Observation Program

Collecting observational data, with no outreach people in this program, so it’s an extension of their work that they’re taking on. The research focus is on volcanic ash, involving volunteers in collecting observations and samples of ash. Over 100 volcanoes, most distant from Anchorage, so they need distributed contributors with timely access to ash plumes. The data are used to ground truth satellite data and ash fall models, and another important need is met by collaborating with NWS Ashfall Advisories, since NWS is the agency that releases those advisories in real time. So timeliness is really important. The data points are used for isomass maps, and samples let them understand the composition of the ash. The remoteness of the Aleutian Arc volcanoes makes volunteers essential for data collection.

The ashfall models are not yet public; that ground-truth validation is needed first, so that they can eventually be viewed by the public during eruptions. They use a variety of training forums, with written instructions, video, workshops, and targeting of key citizen groups. They take a variety of observation and sample types of varying complexity: eyewitness accounts, thickness measurements, measured-area sampling, time-incremental sampling, and bulk sampling. Give people the choice to report what they feel able to do; provide very explicit and simple instructions for all those activities.

Website includes dashboard for each volcano – intro material, reporting, and send email. They put instructions in a variety of media to make sure everyone can use the format that works best for them. Instructions include lots of pictures to make it very obvious, diagrams, etc – might feel like you’re insulting people, but a lot of people just don’t know how to measure a baking pan’s dimensions. Data sheet includes metric ruler because people use inches when they mean centimeters. They also hold community workshops in downwind communities, great way to get long-term participants who become really loyal and they keep in touch – she calls in advance when precursor activity starts up so they can refresh themselves on their procedures.

Also working with National Weather Service Spotters, who are highly trained, just as excited about volcano science as weather science, and tend to be very valuable. About 250 observations and 55 samples have been sent in. They send out hand-written thank-yous with pictures of the eruption for which the volunteer sent a sample, to make sure people know it’s appreciated. Sometimes samples aren’t well documented, but some volunteers demonstrate extremely careful documentation of their samples and procedures.

Just added an ashfall recording page. They don’t have an outreach person, so she has to be in the field during eruptions or have volunteers do it – very stressful. So they put together an online database with a 4-step form that branches based on the data entered. Time/date, several ways to identify locations, yes/no on whether ash was present, details about the ashfall event, and then, if you sampled, more questions on the type of sample and comments. Contact info is optional – they need to be able to follow up on many of the observations for weird reports or special data collection, but this allows people to opt out of providing PII. They have a map that shows 24-hour reports of ashfall; it shows only yes/no whether there is ash, and locations aren’t very specific, but specific enough to be useful for the public.
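The 4-step form and the coarsened public map described above suggest a simple data model. This is a hypothetical sketch, with illustrative field names (not AVO's actual database schema):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical model of the 4-step ashfall report described: date/time,
# location, yes/no on ash, event details, then sample questions only if a
# sample was taken. Contact info is optional so reporters can omit PII.
@dataclass
class AshfallReport:
    timestamp: str                        # step 1: time/date of observation
    latitude: float                       # step 2: one of several location options
    longitude: float
    ash_present: bool                     # step 3: yes/no
    details: str = ""                     # step 4: ashfall description
    sample_type: Optional[str] = None     # asked only if a sample was taken
    contact_email: Optional[str] = None   # optional: reporters can opt out of PII

def public_map_point(r: AshfallReport, precision: int = 1) -> dict:
    """Coarsen location and expose only yes/no for the public 24-hour map."""
    return {
        "lat": round(r.latitude, precision),
        "lon": round(r.longitude, precision),
        "ash": r.ash_present,
    }

r = AshfallReport("2012-09-11T10:30", 61.2181, -149.9003, True, "light dusting")
public_map_point(r)  # -> {"lat": 61.2, "lon": -149.9, "ash": True}
```

Keeping the full coordinates and contact info internal while publishing only rounded locations and the yes/no flag matches the report's point that the public map is "specific enough to be useful" without exposing reporters.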

Can’t use this system yet due to bureaucracy, but it was designed in a way that is not volcano-specific, so it can be used globally. Gave a copy of the software to New Zealand – they will probably have it running before the US does. Back-end tracking with CRM-like functionality, etc. An image database is now available and growing fast, and photographers love that they get credit for their photos.
