Trip Report: USGS CDI Citizen Science workshop, day 1

9/11/12 USGS Community Data Integration citizen science workshop

Ice Core Lab, Federal Center, Denver, CO



Randy Updike – Rocky Mountain Region

Randy is a hazards scientist, and citizen participation has been critical in his work. He led a NAS team to a volcano in South America where 25K people had been killed – sheer chaos, and most infrastructure was destroyed. The team relied upon the locals who had survived to provide support and tell them what to go see. The citizenry were the most valuable resource: they had observed the event and knew which features of it were unusual and remarkable, so they led the team to places where there were interesting things to see. It was a great trip, not due to the scientists, but rather because of the involvement of the local people.

Has needed to engage the public in this work, and in the past the technology wasn’t there to make it happen, but today it is. Excited to learn how technologies can support the science.


Linda Gunderson via WebEx – Director, Office of Science Quality and Integrity

As crowdsourcing and citizen science become more accepted practices, this is really changing the way people work in exciting ways. In the last 2 months, she got a call from the White House and learned that Teachers Without Borders were interested in working with USGS on a hazards project focused on introducing women and girls to science. It was a phenomenal event. Question from TWB: what would you do if you could harness the passion and attention of 100K women and girls across the globe? What an amazing idea! What they proposed to the White House is to recruit 100K women and girls, teach them a science curriculum, then have them do vulnerability assessments of their communities to incorporate into the global earthquake model.

What would you do with the millions of students and everyday citizens who are interested in our world across the US? How could they contribute to the USGS mission? Need to be careful about data quality and rules, but don’t let those things bog you down in the face of the big questions.

Citizen science is a named objective in the coming strategy for DOI. They are interested and want it to happen. Part of the reason is that it's an ingenious way to harness that power, but it also develops scientific literacy, inviting students into careers in science, which is a desperate need for long-term growth, innovation, and the economy. A competition for development of a citizen science app is in the works, in partnership with citizen science groups; they have money to put behind it and it will be announced soon.


Jennifer Shirk, Cornell Lab of Ornithology

Exploring the Landscape of Citizen Science and PPSR

Exciting that USGS is exploring its identity and role in this field of practice. Given the growing landscape of citizen science, how might organizations work together to map out and manage the terrain? We're exploring this landscape together.

Citizen science is part of the mission statement for CLO; she manages Citizen Science Central, extending involvement in citizen science far beyond birds. Goal of website is sharing best practices and ideas among the community; we need to learn from practitioners and people involved in the diverse kinds of work like citizen science where the public are involved in scientific work. With the growth of citizen science, there’s need for more support for the field.

Sharing brief stories to illustrate the possibilities of citizen science. Terry Root, who’s used data from the Breeding Bird Survey, owes her career to citizen science, documenting bird distribution trends, which wouldn’t be possible without distributed observers. Close to 500 papers have been published with BBS data. She’s also interested in influencing public literacy; how do you do this when you’re working remotely with people you’ll never meet? Terry says she does that by encouraging her students not to hesitate to use citizen science data, with the caveat that you have to ground truth them and determine what data are usable.

Julia Parrish is at UW, where she started COASST to explore the status of seabird populations throughout the PNW by looking at dead birds on the beach. She designed a rigorous protocol for identifying the birds, plus verification with photos and other data. Has been able to understand, publish, and influence management around beached birds to watch for unnatural events. Initially approached Audubon to recruit volunteers, and found birders weren't a good audience; needed people who were more interested in the beaches and the health of beach ecosystems. She says citizen science is not just about science, but about people doing science about something they love. Learned that you can't get people to monitor the most "important" beach – you need to let them connect to what matters to them, find and celebrate their connection to the place/organism, and that improves retention.

Grupo Tortuguero de las Californias – the story of Wallace J. Nichols, who started an EarthWatch project in Baja to look at populations of the highly endangered black sea turtle. He started those expeditions as a PhD student because he couldn't fund the work due to skepticism about even finding these turtles. Partnered not only with EarthWatch but also with fishermen and turtle hunters, people whose culture and identity are really linked to the sea turtles. A network is growing throughout Baja and beyond of people supporting research and management, and it has made a big difference in the recovery of this endangered animal. Who needs this data? Beyond scientists, the locals need and appreciate it. Wrapping up the stories with practical advice: when starting these projects, don't be daunted – they basically sold everyone they knew a T-shirt to initially fund the project. A highlight for her at the PPSR conference was these people meeting one another.

Reflect on where USGS is in this landscape – by holding this workshop and asking questions, USGS is pretty centrally located and can help us come to a better understanding of this terrain in PPSR. Going back to the analogy of landscape, how can we collectively map out and navigate the terrain? Lots of work is needed just to understand this: what other projects are out there? Who is in my area or organization that I can learn from? What's already known about how to do it well? Where do I find ideas, resources, papers? What contribution can my project make? These are questions we're hoping to answer in part through a project database launching this fall, intended to be a centralized resource for project leaders to curate info about projects & pubs in one place.

Lots of procedural things to consider for getting from inputs to outputs. The power of citizen science is definitely in the bulk of observations, but also in experiences – not just deeply meaningful momentary experiences, but experiences that help people come to know something scientifically. Your work in citizen science will ultimately influence both, but how you achieve these things matters. Just like good science & good education, science literacy needs to be approached thoughtfully, intentionally, and deliberately.


Citizen science policies and challenges session

Introduced by Annie Simpson

The goal is clarifying USGS & govt policies and providing a platform for citizen science researchers to share instances where policy hasn't worked well for them – opportunities for lessons learned. Want to document how to work better within policy limitations and structures.


Lorna Schmid, USGS Infrastructure & Operations Team Lead

Policy Issues with Citizen Science App Development

Why do we need a mobile framework? To address multiple challenges and establish support tools and processes – all the reasons we all know: unknown or limited resources, little support for collaboration, funding, data sharing. The goal of the project is to develop a one-stop shop that enables workflow processes, code repositories, a mobile community, and common training.

Workshop outcomes on mobile: understanding the mobile ecosystem – hardware, app development & delivery. Initial focus is on the app development life cycle. The process of mobile product development has 4 phases, starting w/ ideation and moving through to delivery. The ideation process includes steps for peer review to ensure an app is not duplicative, has all the needed parts, and will reflect well on USGS. Fairly technical process workflow.

Development workflow: check policy issues, develop communications and marketing plan, mobile community peer review, science review, plus security checklist and specific checklist for development document. Then review/approval for external products (e.g. created by a partner?) involves user testing, security approval, policy review, cost center approval. Once the app is published, need to monitor its performance – if it has low ratings in Apple Store, need to review the product, which reflects on the organization. Issues like scope changes – minor typos, major changes go back to development process.

Recap: training a mobile community; need this to vet applications. Recognize audience dependence and the need for a phased workflow with documentation. Next steps are an FY13 proposal to continue developing the framework, build out the mobile community site, host town halls, continue working with the team, and advocate for a stronger community.


Rob Thieler, USGS Research Scientist w/ Coastal Marine Program

Researcher Point of View: iPlover: A Smartphone Application to Characterize Piping Plover Nest Locations

Found that research scientists collected data in field notebooks but most had smartphones – why not collect right on the phone? Learned several lessons from this.

Business case: sea level rising, this is a very big deal. What to do about it? Natural and cultural resources at risk – produced preliminary map of vulnerable locations. Need to inform decisions about climate change w/ uncertainty management framework; scientific priorities and practices need to change to make it happen.

Looked for poster child, picked Piping Plovers to work with USFWS and NPS, this is his cute charismatic species slide. Piping plovers prefer hazardous beaches for nesting, often subject to storms and other issues. The plovers are hard to see on the beach, nests are even harder to see but are out in the open. Got involved to support decision-making for DOI agencies; listed species, DOI mgt responsibility, interesting and specific habitat requirements that can be predicted, can then feed that back into population dynamics models. Their approach is looking at cascade of info through Bayesian Networks.

A geologist's view of plovers' thinking, trying to figure out how to gather data (give them a widget, not a widgeon!). Deploying trained observers and providing protocols, vastly increasing the spatial domain, dealing with fuzzy observations, and already knowing what to collect – screens of app interfaces.

Lessons learned: HTML5 + JavaScript works best. Most devices have issues – wide variations in baseband chips, browser support uneven. Just because someone has a smartphone doesn’t mean they know how to use it! But valuable data can be collected. More lessons: little guidance on mobile apps – Privacy, PII, iconography, branding, OpenID (authentication failure for offline data collection), OMB approval.

24+ organizations monitoring plovers; USGS needs a strategic plan for mobile apps for many reasons. Project needs data to drive research & facilitate decision support. Got a smartphone app working but hurdles included lack of guidance and policy impediments. Demonstrated great opportunities and challenges.


Cheryl Smith, USGS Volunteer Program Coordinator (WebEx)

USGS Volunteer Program and Handbook

Views here are from the official USGS volunteer management guidelines. HR oversees volunteers: it maintains the handbook, is the contact for DOI, and gives employees access to the volunteer website to post opportunities. It also fields calls and answers questions about volunteer programs.

Care and feeding of volunteers is done by the science centers who work w/ volunteers – the fun part. Volunteers have to track time spent and turn in hours for evaluation. The Dept is encouraging bureaus to develop more volunteer opportunities due to budget constraints and the major contributions of volunteers to USGS. Volunteers have to pass a fingerprint check and be US citizens or have a work visa. That just gives access to the building. Unless they need IT or special access to buildings, that's the only security requirement for working less than 180 days.

Volunteers need to sign a special agreement; there's no minimum age requirement, but minors need an emergency medical care release. A hazard review must be conducted if minors will be involved. No underwater diving or use of firearms.

The Volunteer for Science Handbook includes sections on health and safety related to hazardous conditions, e.g. in labs, on boats/watercraft, or on unscheduled aircraft – these make the gov't liable for claims against USGS. That's the main reason to have volunteers properly signed up w/ written agreements, a detailed description of their volunteer job, etc. It also includes ethics rules: volunteers have to abide by USGS ethics, e.g. not sharing sensitive data. Constraints are in place for all volunteers; scientists emeriti are also part of this scene and become volunteers subject to these constraints, even as former employees.

Post volunteer opportunities on


Eric Wolf and Barbara Poore, USGS Center for Coastal and Watershed Studies

Researcher Point of View: How are USGS Citizen Projects Impacted by Government Policies?

Current policy issues – Privacy Act (1974), Paperwork Reduction Act (1995), Data Quality Act (2002), Fundamental Science Practices, USGS Volunteer Handbook (2011). The Open Government Directive in 2009 is a bit less restrictive than the USGS Vol Handbook.

Paperwork Reduction Act – supposed to reduce duplicative work (which is part of the point of some citizen science). Volunteer policy about fingerprinting and long forms really imposes a burden on volunteers. Activities addressed in the handbook get a lot of guidance; it doesn't cover the lightweight stuff – Acadia Learning, MLMP – do teachers stand in for parents with student groups (in loco parentis)?

Citizen science is global, but people who aren’t citizens or legal aliens can’t be involved. No one outside of US can participate, and there are no guidelines for how to work with these folks. Few guidelines on mobile apps, as we saw. Also mining data from social media that are publicly shared. Concerned about potential for ethically questionable activities in data mining, even if following the law – does government have different responsibilities than corporations?

Shape of citizen science today: who asks questions? Experts. Geography of questions – local, primarily due to policies; participants are also local. Knowledge/skills require being fairly educated. Consent must be documented. Research design is done by experts. Geography of technology is stationary. Usability is complex (i.e. sucks). Data use is pretty much only for experts – are the people collecting the data even able to get the data? The basic ethos is that data are shared and are public domain.

In the future, this balance could change. Citizens could ask questions, geography of questions and participants could be global, no knowledge/skills required; unconscious consent; citizens involved in research design; portable technologies; simple usability – no one complains about something being too easy to use; community data use, let everyone explore the data.


Shari Baloch and David Newman – USGS Office of Enterprise Information (WebEx)

Lightning Talk: Freedom of Information Act – Employee Responsibilities GIP 140

3 laws impacting science activities – Paperwork Reduction Act (PRA), Privacy Act, FOIA

Paperwork Reduction Act: requires OMB approval for collecting structured info from 10+ members of the public annually – e.g. a form, web page survey, focus group, phone survey, mobile apps. The public is basically anyone not affiliated with a federal agency. The point is minimizing burden on the public and improving the quality and use of federal info. The process takes 6-9 months with 2 public comment periods, so contact them early. If you know of an existing data collection without OMB approval, contact them to figure out what the options are for getting approval.

FOIA – most records are subject to it, and USGS promotes transparency – so don’t badmouth anyone in your documents because it could be requested. Some records might be withheld through exemptions. Employee responsibilities: request FOIA rep to coordinate response because of 20-day time limit which is subject to litigation and that’s happened before. Also need to follow steps for FOIA requests (GIP 140).

Privacy Act – responsibility for Personally Identifiable Info (PII). Need to deal w/ the Privacy Act system of records notice (SORN): anything that uses a unique identifier needs approval from Congress and the DOI Secretary. Suspected breaches need to be reported w/in an hour!


Paul Earle, USGS Hazards, for Sophia Liu

Researcher Point of View: USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

Intro slides w/ XKCD comic! Geospatial crowdsourcing – typology includes types of crowds: survivors, diaspora, social net, expert, col; types of sourcing: tagging, mapping, curating, feeding, harvesting.

TED started in 2009 after a major earthquake that killed 60-70K people. USGS was criticized – people knew about it on Twitter before USGS did, and it got some press. Seismologists know an event happened but can't estimate its characteristics quickly enough, while the affected public already knows. The question was how to make that info work for USGS – they couldn't even access Twitter from a government computer!

Worked w/ the USGS social media group to figure out how to get access to Twitter and do some initial test cases. Found potential ways to use it – not just detection, as bloggers suggested, but also getting a rough location, because most tweets have location metadata: GPS lat-long for about 2%, plus city names, and getting down to the city level is pretty good for earthquakes. The speed of light is faster than the speed of sound, so Twitter can be faster than seismic stations, which are pretty sparse globally. E.g., in China, the seismographic data is delayed by 15 minutes before it's sent to USGS, so they had no info about that huge earthquake in 2009.
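The detection idea described above – watch for a burst of earthquake-related tweets – can be sketched in a few lines. This is a hypothetical illustration, not the actual TED implementation; the keywords, window size, and threshold are all invented for the example.

```python
from collections import deque
from datetime import datetime, timedelta

# Illustrative keyword list; a real system would use many languages and terms.
KEYWORDS = ("earthquake", "terremoto", "sismo")

class TweetQuakeDetector:
    """Flag a possible earthquake when the rate of keyword tweets spikes."""

    def __init__(self, window_sec=60, threshold=20):
        self.window = timedelta(seconds=window_sec)
        self.threshold = threshold      # tweets per window that trigger a detection
        self.times = deque()            # timestamps of recent keyword tweets

    def observe(self, text, timestamp):
        """Feed one tweet; return True if the rate crosses the threshold."""
        if not any(k in text.lower() for k in KEYWORDS):
            return False
        self.times.append(timestamp)
        # Drop tweets that have fallen out of the sliding window.
        while self.times and timestamp - self.times[0] > self.window:
            self.times.popleft()
        return len(self.times) >= self.threshold
```

A real deployment would also aggregate the location metadata of the triggering tweets (city names, the ~2% with GPS coordinates) to estimate a rough epicenter, and then wait for seismic confirmation before publishing.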

Got framework set up by a computer science student, who then got a job – so they got another student, but there’s a lot of variability in student motivation and skills. After that person, got a little more money and got some professionals in on it to build a robust system to gather tweets and send out public messages about seismically verified events. That framework provided ability to build other applications off of it. Product has been rock solid, big advantage of using professional developers. One of few systems that works better in real time than on static data!

Performance: has had detections in as little as 18 seconds! Average of 3 events per day, 95% correlated with true seismic activity. Faster than $200M seismic tracking system. Alert delay is 26 – 237 seconds. Magnitude of detected events from 2.5 – 7.4 on Richter Scale. Even though it requires jumping through hoops, it really does what it’s supposed to do, there’s no way to do this as an individual researcher.


Kevin Gallagher – Associate Director, Core Science Systems

Lots of discussion about engagement driven by new technology, but citizen science has been going on very successfully for a very long time, particularly in conservation efforts. So what's really new here? The opportunity has been here for a long time; it makes the mission more relevant. USGS is competing at the national level for funding with every other agency. The opportunity to engage w/ citizens lets USGS showcase assets and work, and improve mission relevancy in the public eye.

What’s really new is the tech that lets us engage in new ways, e.g. mobile devices. Need diverse talents to realize the potential, how to simplify and move forward. Policy is a big hurdle, so a subcommittee that can work on those issues to help support smoother, faster approvals would be valuable. Building out mobile computing infrastructure also needed, and sharing stories too.

Some discussion of attendees’ goals in being at the workshop – appreciation from “outsiders” for being present and being able to engage, appreciation from USGS for having non-USGS people involved. Need to start getting the house in order; there’s no question that as things move forward, there’s need to work with partners and steal shamelessly from what other agencies are doing. Committed to making this successful, creating forums, investing in staff support, assist in moving policy, put funding on the line to make it happen.


Matt Cannister, USGS Biological Technician (Southeastern Ecological Science Center)

Researcher Point of View – An Overview of the USGS Nonindigenous Aquatic Species (NAS) Online Sighting Report System and Lessons Learned from Mobile App Development

75K+ records for 1K+ nonnative (mostly freshwater) species, each record with up to 60+ data fields; the public can see 20-30 data fields. Also shows point and HUC (hydrologic unit code, i.e. watershed) distribution maps, e.g., for Lionfish. Also does species accounts with fact sheets for public reference. Now has an alert system for the Early Detection/Rapid Response aspect – a big aspect of invasion biology: get rid of the problem before it's a problem. Alerts are generated when a species is observed in a novel area, and go out through an RSS feed and Twitter, plus email alerts filtered by state, group of species, specific species, etc. Have issued almost 1200 alerts; 62% came from personal communications through the Online Sighting Report form, and 83% of confirmed Lionfish records are from public reporting. People can fill in various fields, upload images for verification, and read lots of disclaimers.
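The "alert on a novel occurrence" logic described above is simple enough to sketch: an alert fires when a species is reported in a watershed (HUC) where it has not been recorded before, and goes to subscribers filtered by species. This is a hypothetical illustration, not the actual NAS system; all names and data are invented.

```python
# Known occurrences per species, keyed by HUC code (example data).
known_occurrences = {
    "Pterois volitans": {"03080101", "03080103"},   # lionfish, illustrative HUCs
}

# Subscribers who asked for email alerts on a specific species (illustrative).
subscribers = [
    {"email": "biologist@example.gov", "species": "Pterois volitans"},
    {"email": "manager@example.gov", "species": "Dreissena polymorpha"},
]

def report_sighting(species, huc):
    """Record a sighting; return an alert dict (with recipients) only if novel."""
    seen = known_occurrences.setdefault(species, set())
    if huc in seen:
        return None                     # already known in that watershed: no alert
    seen.add(huc)
    recipients = [s["email"] for s in subscribers if s["species"] == species]
    return {"species": species, "huc": huc, "notify": recipients}
```

The real system also pushes each alert to an RSS feed and Twitter and supports filtering by state and species group, but the core novelty check works the same way.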

Want to turn that into a mobile app. One constraint with online reporting is that data entry happens after field time, with the potential of losing details. Want to make an app like iPlover but for aquatic invasives. They have the app – sort of! A working prototype emerged in 3 months, fully functional with back-end server support. But the issues making it only a "sort of" are neither technological nor financial – the problems are all bureaucratic. Have been waiting since November 2011 for DOI to approve the Management/Review/Approval process for app development before the app can be sent out to app stores.


Policy discussion

TOS for Apple and Google app stores are not acceptable for USGS, though some projects have achieved one-off agreements, e.g. NPN. NPN collected its millionth data point and did an emergency data collection approval with OMB – jumped through many hoops, got scolded, has a 6-month approval, and now has to go through the rest of those steps. Just getting sign-off on the TOS doesn't mean any of the rest of the stuff is ready to go.

There is now a fast-track process for OMB that is actually working; using the web form takes a lot less time – as little as three weeks for questionnaire approval, compared to 2 years. It's hard to speak to all the parts of the process; maybe we need a checklist of how to go through all of these steps.

Need to avoid hiding USGS work in universities, etc. Already fighting for funding, need credit for this work. Things like mobile app dev framework will help navigate some of those other issues. Having people use universities as an intermediary dilutes credit to USGS. Need for partnerships that meet all players’ needs. Concerns about university partnerships being less stable than USGS needs, e.g., due to funding cycles.

Nothing stopping USGS from developing tools (subject to review process) and releasing as open source, e.g. technologies and protocols which are huge amount of effort to develop. That makes much bigger potential impact.

Need vision statement and direction for moving this forward at USGS level. How much would a general strategic plan help? It would make a difference to establish big questions to answer, how to work together, the kinds of communities to form, and issues that need to be addressed, e.g. policy. It could be a very simple framework, but that always helps.

When were these showstopper policies put in place? Are they outdated? Doing the workarounds could show the value of updating those policies, because that’s the really big roadblock.

What have been the major issues with PII in citizen science at USGS? Wouldn't say mistakes are made or that requirements are misunderstood, but organizers are sometimes doing it their own way rather than going through the official processes. There's some misunderstanding of what PII is – just a name isn't PII, but two different pieces of related, linkable information together are. Lacking a unique identifier makes the data problematic for research use because of data quality concerns.

Policies so complex (and numerous) that there’s room for multiple interpretations.

How can USGS scientists figure out policy and resources to recruit effectively within policy guidelines? Hard to figure out what the full process is, right now it’s word of mouth, don’t know where to go for that info. There’s a new website to try to help document all that info, building a one-stop-shop.

There's a memo from OSTP covering things like 3rd-party ID systems, but virtually no one seems to know about it. People also don't know when there are updates to policy – need a way of keeping people up to date on changes. Collecting up the policy knowledge in the room would help generate a pretty comprehensive list.

Many of these issues are burden to domain researchers, they shouldn’t have to deal with all the policy and other related stuff outside of their areas of expertise and be pulled out of their research. Same issue in data management. Need an ombudsperson or liaison to help researchers navigate these issues.

Sally Holl discovered Code for America and suggests creating a Code for Science proposal. The science sector lags behind other areas: a lot of government data is not available, and what is available is not all that usable. Need to be able to host hackathons with staff providing bridges between domain science and IT development. Doing this would require hiring a group rather than individuals who all need to be fingerprinted, etc. A contest would be new territory – USGS has never done this before; Abby is working on putting a proposal for a challenge together.


Engaging the Public in Scientific Research Session


My talk – The Evolving Landscape of Citizen Science


Barb Horn – State of Colorado (WebEx) NWQMC

Engaging the Public in Scientific Research Perspectives from the Volunteer Monitoring Community

Volunteers as workforce – a different model and set of resources to support; involving volunteers makes it more than just research. The volunteer monitoring community is sort of its own community, especially in water quality monitoring. She's been trying to explain to the National Water Monitoring Council that volunteer monitoring projects do pretty much the same thing professionals do. Data management and technologies are an issue, as for everyone else.

Started out as primarily education and stewardship, but has moved toward decision-making and taking action. Planning phase steps really don’t show how to deal with the people part – figuring out who the decision-makers are, what info they need, and that’s what should drive the rest of the planning steps.

The planning process starts with what exists, formulating objectives, how to manage and analyze, how to deliver to decision-makers, evaluating project success, what needs documentation, and what's the appropriate network to use those assets. General approaches are tiered, based on the amount of rigor and time involved. The advantage of the simple starting point is that you can potentially move people up into a higher tier, but the disadvantages have to do with data interoperability and not having the right level of data for a particular location. The other approach is one-size-fits-all: all the resources go into one product/service/delivery/support model, which supports interoperability, but it costs more up front, may lose people who want to do the more rigorous work, and might not meet all the objectives.

River Watch is a one-size-fits-all program. They monitor around 700 stations a year, including field measurements, lab analysis, physical habitat, macroinvertebrates, and photos. Everyone does the same training, and if they can't meet QA/QC cut-off points, they can't continue to participate. Equipment issues: loaning vs. giving equipment, with either the group keeping it or River Watch reclaiming it, etc. Standardization across methodology is required for this kind of data.

Analysis approaches: some groups do it all, others collect and hand over to experts, and some do a combination, but in all cases it has to be standardized. Study design may be very different for different participant groups, and sometimes a group starts with one study design, then changes the design to match the outcomes, then maybe does an intervention and later goes back and re-checks. So monitoring can evolve and study designs change. A survey on demographics and motivations found the main purposes are passion and making a difference. Should document volunteer responsibilities and what sponsors support – typical volunteer coordination considerations.


Greg Newman, Colorado State University Cyberinfrastructure Support for Grassroots Conservation, Citizen Science, and Community-based Monitoring

Learn from the masses, and then work with them – the art of teaching is the art of assisting discovery – these are things that make citizen science work. Keys to success: volunteers, scientists, & mavens who make the projects happen; need all three roles to make a project work.

Today we have countless projects, creating volumes of data, and they often focus on pre-defined topics – birds, phenology, etc. What if you want to measure something else? With that question in mind, felt there was a need for a platform to help new projects get going. Got NSF money and built such a system, facilitate creation of projects, encourage cross-project partnerships, support data management. Platform supports both top-down and bottom-up approaches to project design. Steps in project development are far more complex than most people realize.

A simple web portal for setting up projects – personalizing with photos, building their own data sheets, etc. The system allows membership gatekeeping to help support data quality; project leaders build data sheets from typical template items and add specific attributes and site characteristics. With people specifying attributes, there's a growing number of attributes to manage, e.g. species attributes like height, weight, etc. The goal is standardized, interoperable, and flexible data – integrating data from Pika Watch and Pika Net, all adopting the same protocols. Projects determine what to measure – so far, species attributes and site characteristics, but it could be kW energy use in houses as a new attribute in the system. Their mantra: if you want to measure it, you can measure it, but be sure no one else is already measuring it!
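The attribute-registry idea described above – projects build data sheets only from shared, registered attributes, so data stay standardized and comparable across projects – can be sketched roughly as follows. This is an illustrative sketch, not the actual platform; all names and attributes are hypothetical.

```python
# Shared registry of measurable attributes (illustrative). New attributes,
# like household energy use, can be registered once and reused by any project.
attribute_registry = {
    "plant_height": {"unit": "cm", "type": float},
    "count":        {"unit": "individuals", "type": int},
    "kwh_used":     {"unit": "kWh", "type": float},
}

def make_datasheet(project, attributes):
    """Build a data-sheet template; only registered attributes are allowed."""
    unknown = [a for a in attributes if a not in attribute_registry]
    if unknown:
        raise ValueError(f"unregistered attributes: {unknown}")
    return {"project": project,
            "fields": {a: attribute_registry[a] for a in attributes}}

def validate_record(datasheet, record):
    """Check a submitted observation against the data-sheet field types."""
    return all(isinstance(record.get(a), spec["type"])
               for a, spec in datasheet["fields"].items())
```

Because every project draws from the same registry, observations from, say, two pika projects using the same attributes can be merged without field-by-field cross-walking.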

For data analysis, there are automatically updated canned reports, but working to build custom analysis where they can pick their own variables. In the future, hoping to develop open APIs and extend interoperability. There’s also a feedback tool for surveying volunteers. So far they’ve had 40+ projects, 7K+ observations, tech reports, academic publications, etc.

Making online citizen science successful: shared praxis, amplified collective intelligence, restructured expert attention. What motivates people? Sharing ownership, prompting them along the way, rewarding and respecting them. Are rewards effective? Communicating and sharing results is number one – that's why they're participating. Show progress; make sure they know they're making headway toward an answer to the research question. Give attribution – really acknowledge them. Harnessing these motivations: adopt common protocols, encourage standards while remaining flexible. How to succeed? Train the trainer (training a coordinator who trains volunteers) and crowdsourcing to distributed groups. Each model of participation has its place in the bigger picture.

So what’s next? Event management, online training support, volunteer tracking, improving project social media marketing, integrate w/ SciStarter to promote projects. Expand analysis, customized reporting, mobile apps, and blogging features.


Greg Matthews, USGS National Map Corps

Volunteer Map Data Collection – The National Map Corps

National Map Corps – volunteer geographic info used for improving topo maps, but eventually got shut down. Restarted in 2010 with OSM; started with software – OSM had quite a few advantages, but they needed a willing partner, which ended up being the State of Kansas Data Access and Support Center. Took their data set and cross-walked it to the USGS best practices model, then loaded it into the system so editors could edit roads. Kansas focused on state routes, while USGS focused on interstate routes. Aside from a few hitches, it has worked well.

Guidelines for editing also important – the two agencies had the same reasons for doing the editing, which is fairly rare. Successfully deployed the system customized to USGS specs, worked with co-editing using cross-agency shared specs, and used lessons learned for phase 2. This is a large federal agency, but they had 3 staff with about 1/3 of their time each – a small team for a project of this scope.

Phase 2 involved structures (buildings), volunteers from student bodies, and evaluating quality/cost to ensure there would be ongoing internal support; also very important to make it available to The National Map and meet all their quality needs. Still successfully using the OSM technology stack, and trying to contribute the data back to that community as well. Editing interface was much improved using user-centered design. Contributed points include a number of attributes. Was specifically collecting the info for the structures program. Seeded project area with gazetteer info on structures, shown in red, checked over by editors and then approved or changed, and then shown in yellow, so you don’t get duplication of effort; the map itself shows the status of editing tasks. Also involved peer review – editors checking each other’s work.

Outreach and communication was easy using Google Sites, so it was unofficial and only available to those who were invited to participate. Forum was also set up in the hopes of reducing volunteer management load as well as collecting feedback. Basic thank-you certificate of achievement, considered it the least they could do, certificate shows how many structures were included on The National Map – pretty popular with some people.

Had over 1100 points contributed or edited, lots of new points added, and some historical points removed. Positional accuracy also improved – from 50% to 84%, and after peer review to 92% without USGS involvement, meeting The National Map standard of 90% of points accurate within 40 feet.
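The accuracy standard above reduces to a simple check. A minimal sketch, assuming the standard means a dataset passes when at least 90% of points fall within 40 feet of their true locations (the error values below are illustrative only, not the project's actual measurements):

```python
# Back-of-envelope check of the assumed positional accuracy standard:
# a dataset passes if >= 90% of points are within 40 feet of truth.
def meets_standard(errors_ft, threshold_ft=40.0, required_fraction=0.90):
    """Return True if enough points fall within the positional threshold."""
    within = sum(1 for e in errors_ft if e <= threshold_ft)
    return within / len(errors_ft) >= required_fraction

# Illustrative error lists only -- not the actual project data.
before = [10, 35, 60, 80, 20, 45, 90, 15, 70, 30]   # 50% within 40 ft
after  = [10, 12, 35, 38, 20, 25, 41, 15, 30, 22]   # 90% within 40 ft

print(meets_standard(before))  # False
print(meets_standard(after))   # True
```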

Phase 3 – now collecting a new type of data guided by needs of The National Map. Content management in Confluence, with directions, video tutorials, no login and now official with approval. Editing interface is purpose-built to make it as easy as possible to contribute to the system. Adopt-a-Quad system lets people review all the data in a quadrant, more technical job and asks a little more of those people, going well so far. Volunteer contributions after public announcement 3 weeks ago: about 1500 data contributions, almost finished half of Colorado in very short order. Hoping to scale up beyond CO in next few months for these 10 structure types, potentially serving a larger set of purposes beyond USGS.


Kristi Wallace, USGS Alaska Volcano Observatory

The Alaska Volcano Observatory Citizen Network Ash Collection and Observation Program

Collecting observational data; no outreach people in this program, so it’s an extension of their work that they’re taking on. Research focus is on volcanic ash, involving volunteers in collecting observations and samples of ash. Over 100 volcanoes, most distant from Anchorage, so they need distributed contributors with timely access to ash plumes. The data are used to ground-truth satellite data and ash fall models, and another important need is met by collaborating with NWS Ashfall Advisories, since NWS is the agency that releases those advisories in real time. So timeliness is really important. The data points are used for isomass maps, and samples let them understand the composition of the ash. Aleutian Arc volcanoes are remote, so volunteers are needed to collect data there.

The ashfall models are not yet public; that ground-truth validation is needed before they can be viewed by the public during eruptions. They use a variety of training forums, with written instructions, video, workshops, and targeting of key citizen groups. Take a variety of observation and sample types, complex to simple: eyewitness accounts, thickness measurements, measured-area sampling, time-incremental sampling, bulk sampling. Give people the choice to report what they feel able to do; provide very explicit and simple instructions for all those activities.

Website includes dashboard for each volcano – intro material, reporting, and send email. They put instructions in a variety of media to make sure everyone can use the format that works best for them. Instructions include lots of pictures to make it very obvious, diagrams, etc – might feel like you’re insulting people, but a lot of people just don’t know how to measure a baking pan’s dimensions. Data sheet includes metric ruler because people use inches when they mean centimeters. They also hold community workshops in downwind communities, great way to get long-term participants who become really loyal and they keep in touch – she calls in advance when precursor activity starts up so they can refresh themselves on their procedures.

Also working with National Weather Service Spotters, who are highly trained, just as excited about volcano science as weather science, and tend to be very valuable. Lots of observations and samples sent in – about 250 observations and 55 samples. They send out hand-written thank-yous with photos of the eruption that was sampled, to make sure people know it’s appreciated. Sometimes samples aren’t well documented, but some demonstrate extremely careful documentation of samples and procedures.

Just added an ashfall reporting page. Don’t have an outreach person; she has to be in the field during eruptions or have volunteers do it – very stressful. So they put together an online database with a 4-step form that branches based on the data entered: time/date, several ways to identify locations, yes/no on whether ash was present, details about the ashfall event, and then, if you sampled, more questions on the type of sample and comments. Optional contact info – they need to be able to follow up on many of the observations for weird reports or special data collection, but this allows people to opt out of providing PII. Have a map that shows 24-hour reports of ashfall; it shows only yes/no whether there is ash, and locations aren’t very specific, but specific enough to be useful for the public.
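The 4-step report structure described above can be sketched as a data model. Field names here are assumptions for illustration, not the actual AVO schema; the point is the optional PII and the stripped-down public view.

```python
# Sketch of the 4-step ashfall report described above (field names are
# assumed, not the actual AVO schema). Contact info is optional so
# people can opt out of providing PII.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AshfallReport:
    observed_at: str                     # step 1: time/date, e.g. "2012-09-11T14:30"
    location: str                        # step 2: one of several location formats
    ash_present: bool                    # step 3: yes/no
    details: Optional[str] = None        # step 4: details, shown only if relevant
    sample_type: Optional[str] = None    # step 4: only if a sample was taken
    contact_email: Optional[str] = None  # optional -- PII opt-out

    def public_view(self) -> dict:
        """What the 24-hour public map shows: only yes/no and a coarse location."""
        return {"location": self.location, "ash_present": self.ash_present}

r = AshfallReport("2012-09-11T14:30", "Anchorage area", True,
                  details="1 mm on car hoods")
print(r.public_view())
```

Keeping `public_view` separate from the full record is one way to reconcile the follow-up need (staff can contact the reporter) with the coarse, yes/no public map.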

Can’t use this system yet due to bureaucracy, but designed it in a way that is not volcano-specific, so it can be used globally. Gave a copy of software to New Zealand to use – they will probably have it running before the US does. Back end tracking with CRM-like functionality, etc. Image database now available, growing fast, and photographers love that they get credit for it.

Outcomes for and Benefits to Participants

Conference on Public Participation in Scientific Research, Day 1 Session 3, 8/4/2012


Building Evaluation Capacity for PPSR
Tina Philips

Focus on evaluation and why it’s needed. Running NestWatch piqued her interest in evaluation, which happens in many contexts – pretty much every sector does it.

Many reasons to evaluate – why not do it? Much to gain from better understanding impacts. What evaluation is not: an audit; assessment; survey – biggest misconception, key to process of evaluation but not the whole thing; research – goals, audience, end products are very different. End goal of evaluation is improving something. Evaluators are not dementors!

Evaluation is systematic collection of data to determine strengths and weaknesses of programs, policy, products, so as to improve their overall effectiveness. Involves planning, implementation, and reporting out – similar to scientific research methods and does use similar methodologies. Takes into account stakeholders, all of them. Because stakeholders and contexts are unique, every evaluation is different.

When to evaluate? Many times – front-end, formative, summative. What is evaluated? Individual outcomes – cognitive, affective, and behavioral. Also programmatic and community-level, but the focus here is individuals. The reason to look at this is that participants are people, not technicians or laborers; they come to interact and do something meaningful. We owe it to them to let them know what they’ll get out of participating, and evaluation is needed to understand that.

Challenging work – main reason it doesn’t happen is time and money constraints, and many PPSR leaders are interdisciplinary – not trained in evaluation. This is the reason for the development of the DEVISE toolkit to help non-evaluators conduct quality evaluations.

For evaluation and design, really important to know goals, outcomes and indicators. Goals are broad, outcomes are more specific, and indicators are the evidence of outcomes. Common pitfalls: wishy-washy outcomes, not aligning outcomes with activities, expecting too much of project, expecting learning through osmosis, not providing support for learning – including behavior change.

Intro to basic DEVISE framework: behavior & stewardship; skills of science inquiry; knowledge of the nature of science; motivation; efficacy; interest in science & the environment. Work in progress, but toolkit is going to address these domains & constructs. Shouldn’t try to evaluate them all, choose and align to the project itself.

Take aways: evaluation is doable, can improve your program, improve chances for sustainability, lead to best practices, and demonstrate impact as a field.

Understanding the Connection Between Participant Motivation and Program Outcomes for Effective Program Design
Kris Stepenuk

Started working with water quality monitoring as a kid, family activity based on concern for kids’ health. Outcomes were identifying hotspots along river for contamination, which it did. Now she coordinates the program, looking to understand motivations, outcomes based on literature, and what we don’t know. Challenge: become researchers of the discipline.

Presented motivations for her project; social outcomes are important parts of motivation. In general, motivations tend to be altruistic and/or related to personal learning.

Indian Country 101: Tribal Communities as Partners in Environmental Restoration
Chris Shelley

If you want to do PPSR on tribal lands, you need to understand the needs and context. In the Columbia River Basin, salmon is critical – 30% of calories in diet, 300 lbs/person/year, they consider themselves salmon people and they take care of the fish. But salmon are in crisis, and so are the communities – what makes them who they are is disappearing.

Was part of the “Salmon Corps” which has 7 site locations and is part of AmeriCorps – map of 4 reservations and their ceded land that was given up in treaties of 1855. Treaty tribes didn’t cede all lands, also retained rights for fishing. Salmon Corps did restoration – fencing pastures to keep cattle out of streams. They also got college credit for work, so it wasn’t just labor but also education.

The Corps members embraced hip-hop culture, wearing jeans halfway down their ass, but doing restoration work helped re-connect them with their culture. Did a culture camp where they learned traditional tribal skills, and they did a lot of cool things that provided important services: wolf introduction, native plantings, restored habitats, assisting people during flash floods, etc.

But again, they didn’t give up fishing rights despite ceding lands, so they have the right to co-manage the salmon resources. They get to do cool stuff off-reservation to help manage salmon, which sometimes butts up against what scientists think is right due to a cultural gap over what is appropriate in science. There were no salmon in the Umatilla River because of land usage, but they restored it back to a natural salmon spawning stream. Great quotes from participants about the meaningfulness of this work: “I know I need an education, but I also want to help the environment and help my people.” – Jeanine Jim-Bluehorse

Most people live near a reservation for which tribes still retain some rights off-reservation due to interpretations of treaties by the Supreme Court. Tribes still have access to resources like water, so they have the right to manage those resources – so how does this intersect with PPSR? Hope is that there are things you want to know about the traditional lands of indigenous people, and that you’ll collaborate with tribes to help them manage their resources and help you learn things about the resources that you couldn’t know otherwise. Believes the salmon crisis cannot be solved without tribal partners being central. Their input will upset some scientists because it’s based in traditional knowledge, not Aristotelian. It will be hard to reconcile, but it’s still worth doing.

Working with these groups will be frustrating to outsiders but incredibly mutually beneficial. Wherever you have cultural diversity in a stable community, you also have biodiversity – this needs to be preserved and supported.

Citizen Science: Science as if People Mattered
Raj Pandya

Very funny intro! We should look at participants as partners in science, not as people doing our science. Science developed with communities, in the context of communities, doing things that communities can live with.

Whatever you call it, PPSR demographics show under-represented groups participate less than majority groups, less affluent participants also outnumbered by affluent ones. Huge group of people are not at the table, and if you’re not at the table you’re often on the menu. Why?

Many issues of access, these are the easiest to fix. Gets harder as you go down the list – cultural barriers can be solved with time and effort. Relevance is most difficult – are the problems investigated by citizen science aligned with community priorities? If we keep on this way, we’ll continue developing “whitey” programs, no offense intended.

Student project in Louisiana Delta, called Vanishing Points, with mobile phone app where people can collect stories/images/etc for culturally, personally, economically important places, and look at what’s likely to happen to those places. Another set of projects around wild rice in the White Earth nation. Third project working on managing meningitis in the Sahel. Meningitis is epidemic in this area, every few years cases spike, lots of mortality and disability. Everyone who lives there tells you it’s a dry season problem, and when the rain arrives the problem goes away. Using this knowledge is really important for effectively distributing limited vaccine supplies.

Steps to take: Align research with community priorities – requires working in interdisciplinary teams and talking to a lot of different community members. Plan for co-management – something is going to go wrong at some point, and you need a plan for trying to deal with that. Incorporate multiple kinds of knowledge – Chris already covered this, just need to harken back to that sense of humility and make space for other knowledge to be relevant and important to the project. Communicate: often has to happen in really small settings, constant work day after day in community settings. It’s really all about engaging the community at every step of the process, deciding what counts as data, what data means, how and when data will be collected, what data is appropriate to share, and working with communities to apply that data to their needs.

By paying even more attention to doing science with people, citizen science can provide a model for making science more relevant and useful.

PPSR’s Contributions to Science

Conference on Public Participation in Scientific Research, Day 1 Session 2 – 8/4/2012


To Use or Not To Use: Is That the Data?
Terry Root

Examples of PPSR biogeography – huge collections with enormous monetary value. Can go back to the 1800s to understand egg laying for phenology changes. Everything was fine with older data – 800K eggs have been digitized – but newer data is problematic because eggs couldn’t be collected. So now nest record cards need to be digitized. These data were used to save the peregrine falcon and brown pelican from DDT risks.

Starting in Victorian times, people were interested in learning more about what they saw but no field guides available – so Arm & Hammer distributed species cards in their baking soda! Has evolved into Christmas Bird Count through a circuitous route.

Once data were computerized, they started to find the data could be used very well to answer research questions. Birdwatchers knew more about irruptive phenomena than scientists – probably caused by climate. Was then able to look at distribution and abundance from CBC data and find out the range change for species.

Many other data sets go way back – priceless info saved by hiding under a mattress! Many of the historical data are in private and museum collections. Big growth in large long-term datasets that are badly needed to address big questions. We are now poised on the edge of a huge explosion of data, but what does misidentification mean for data quality? We used to throw away these data, but now we can use smartphones with cameras and social networks to get info about many things from large numbers of people into scientifically useable datasets. iNaturalist is a great example of how this is working really well – already seeing exponential growth and likely to go viral soon. A 3-year-old can use it and get excited about finding out what a species is.

The Role of “Citizen Science” in Weather and Climate Research
Noel Doesken

Early traditions of weather observation started in US by Ben Franklin and Thomas Jefferson. They communicated to try to understand what was going on, but didn’t have spatial and temporal context. Smithsonian project from 1849 introduced new technologies, telegraph to share weather observations.

Analysis and interpretation of volunteer data proved more difficult than recruiting volunteers – they were getting as many as 500K data points/year, which was when standardization became a big issue. It took 12 years to report on the data, so volunteers had to be very patient.

Colorado state weather service started establishing state-based weather observing networks in the late 1880s with only $2K. Within 10 years there was a solid reporting network, led to nationwide “Cooperative Network” that continues today. First purposes were simple – climate resources of the country, particularly what crops could be grown where and when, also equally important to predict extreme weather.

In the bigger picture, most of the data are very skewed geographically, but it’s a very impressive foundation. Many applications, both scientific and practical, such as climate and health – “a stinking big deal.” Have learned from historical data that there are weather cycles, which is helping model weather. The Dust Bowl and Depression increased interest in weather and climate and advanced the use of volunteer data. The majority of drought monitoring is from citizen science.

Naturally this leads to understanding climate change – CoCoRaHS is keeping this going. We need more rain gauges and finer granularity of placement.

Foldit and Games for Scientific Discovery
Seth Cooper

Many people playing video games, what if we could direct that to solving problems for science? Combining human and computational power, as well as a way to motivate people to engage and solve problems they didn’t think they could contribute to.

One area where there’s lots of potential is biochemistry – proteins and protein folding. Very important part of life. Two ways to look at it, sequences and 3D structures. Hard to solve folding problems algorithmically. Foldit lets gamers use the 3D visualizations and both human and computational tools to solve problems. Have had 250K people play the game, over 100 protein structure puzzles.

Scoring and leaderboards help promote competition which motivates gamers. Technical structures are complex, but robust for solving difficult puzzles. Constantly releasing new features and bug fixes, and giving players feedback. Worldwide community participating, multiple languages. Players have produced very exciting results, protein related to AIDS virus in monkeys, algorithms failed but players succeeded in 3 weeks!

Ended up implementing a scripting structure for “recipes” so players could reuse functions – a player algorithm independently discovered the scientists’ algorithm and performed better! Made a trophy for an early winning player, who keeps it on his desk. Also co-creating structures, and now making interface tools for scientists – when a tool is fun and easy for everyone, it’s also useful for scientists.

The Many Benefits of PPSR
Linda Silka

CBPR – community based participatory research – no research on us without us. Academic research may not be the right way to address problems.

Working with tribal groups on emerald ash borer in Maine – not many ash trees but they are critical to tribal traditions and economic opportunities. CBPR is growing just like citizen science – there are organizations, journals, and grants for training and cross-disciplinary support.

CBPR successes – adding rigor to data collection; need to merge professional and local knowledge to solve problems. Example from tribal lands in Nevada about nuclear contamination – the researchers’ vector model didn’t take into account community food sources. Other examples of ways that community knowledge is strengthening scientific outcomes, e.g. incinerator and air quality household health studies – residents were very concerned about their children and wanted dialogue with researchers. Similar outcomes in emerald ash borer studies and nutritional studies.

Linking knowledge to action by bringing in local stakeholders; federal research agencies/foundations reviewing proposals differently to promote broader impacts. Using research cycle as tool to understand issues that emerge at each stage. Many questions remain about assumptions and unknowns.

Lots of resources at

Looking Back, Moving Forward in PPSR

Conference on Public Participation in Scientific Research, Day 1 Session 1 – 8/4/2012


PPSR: How We Got Here and Where We Go Now
Abe Miller-Rushing

Exciting to bring wide range of disciplines together and develop a more global perspective. Take-aways: PPSR is not new, has always been important to science, and is growing and innovating very quickly.

Models for PPSR – taxonomy of projects by degree of involvement of participants, contributory, collaborative, and co-created.

How we got here: Science began as amateur research with Plato and Aristotle, often done by rich people who had money and time. Some of the most important science has relied on public participation, e.g. Linnaeus. Professionalization of science has marginalized public participation, and that’s where we are today.

Nonetheless, PPSR has continued, just not always labeled as such or recognized as broadly as it should be. Originally it was often specimens, but now it’s usually observational data. It’s also used to solve local problems, and that’s an important role. Big data sets are also being generated through citizen science, some of these are the most important for their field, for example NOAA’s weather data that is being used to understand climate change.

Recent developments: huge improvements in tech, communication, data storage, analysis & best practices, this kind of revolution is not without precedent. Another big advance is with explicit participant-focused outcomes, which is still fairly new.

We need data over wide time periods and geographic ranges to achieve many of our scientific goals moving forward. Things like fine-scale weather observations through CoCoRaHS which is really important for decision-making; looking at changes in phenology to understand climate change and losses in biodiversity, e.g. findings from looking at Thoreau’s data through time with current citizen science data.

Many applications: image and sound analysis; real-time data for near-term predictions; collection and transcription of historical records; health and environmental justice.

Huge growth in PPSR recently, e.g. with ISI on peer-reviewed publications – exponential growth in last 6 years. More to come as cross-disciplinary dialogue, collaboration & innovation develop. Now we see a need to formalize and support the field and practitioners. Still getting push-back at NPS for making management decisions based on PPSR data.

Where do you want PPSR to go? What does the field need? What do you need in your role in PPSR? What should an organization for PPSR do? Poster session opportunities to post your responses to important questions. Will be using this feedback in closing session discussion – please participate and help us act on these recommendations from the community.

Q: Issues – recognition of citizen science and use of the language in publications – what is this doing to help legitimize PPSR?

Grand Challenges and Big Data: Implications for PPSR
Bill Michener

Challenges we face, scientifically and technologically, focusing on data issues as that’s where the rubber hits the road from a science perspective. Many issues we are concerned about, primarily related to climate change, clean energy, and so on. We’re in a new age where we’re hitting some tipping points and likely to see very abrupt changes that will have significant impact on future and quality of life on earth.

Many tools being used for data-intensive science, but data management is one of the challenges standing in the way of results – we need to speed up time to results and reduce time on mundane tasks like data management. Another key challenge is expanding participation.

Major concern – where are the data? We need to be able to integrate data to address major scientific challenges. This leads to the long-tail distribution of data problem with many data orphans. Jim Gray – “Most of the bytes are at the high end, but most of the datasets are at the low end.” Brings up an important question – we’re all familiar with the research life cycle, but how do we link it to the data life cycle?

Solutions: DataONE is addressing some key issues, e.g. data preservation. Intro to

One of the main science data management/analysis tools currently in use is Excel [shifting toward Google Docs]. D1 is developing tools for R, the second-most popular analysis tool, working with many partners.

Issue 2: Data discovery – not easily found with traditional search tools. Major project of D1 has been ONEMercury for searching across datasets.

Issue 3: Tools for innovation and discovery. We’re in the 4th paradigm of research, a focus on data-intensive research that requires new tools, techniques, and ways of doing research. The investigator toolkit is another way to address this question. Examples include DMPTool, a data management planning tool – it helps get grants funded for agencies requiring data management plans, and should be a consideration for PPSR projects. Supports 12+ templates required by different agencies, with a walk-through series of steps to address the required points of a data management plan. At most, all you need to do after going through the wizard is change the font.

Upcoming tool: DataUp to check Excel spreadsheets for best practices, create metadata and connect to ONEShare, one of the D1 repositories – all for free.

Finally, there’s a need for tools for exploration, visualization, and analysis. One example needed data layers from several sources to address a research question – one layer was only found through word of mouth – and the team had to develop new modeling tools and work with a new tool (VisTrails) to develop the visualization.

People and Participation: Educational and Community Components of PPSR Projects
Heidi Ballard

Came to PPSR as HS teacher, then worked in community-based forest management, and then working with science education at UC Davis.

Need to look at PPSR across different practices. Many disciplines and goals represented here: biochem, ecology, astronomy, nat rsc mgt, and public health. Also have: psych, sci & enviro ed, social justice & community development, sociology, anthropology.

PPSR categories are given different names by other scholars; this is explored in a recent Ecology & Society article. Need to think about more than degree of participation and which part of the scientific process. Other important aspects relate to quality – whose interests are being served, and to what end? Who makes decisions? Who has the power?

This leads into discussions of democratizing science – role of PPSR is improved science understanding for everyone, which results in better research. Examples include LiMPETS monitoring 600 miles of CA’s National Marine Sanctuaries, students taking it very seriously because it will be used for science. Additional examples related to rice growing and health impacts.

Looking at social and educational outcomes of PPSR – individual, programmatic, & community level. Often PPSR focuses on programmatic level – audience reach, engagement, program strengths/weaknesses, etc. Current work focusing on individual learning outcomes, and community-level outcomes are exciting area to develop – social capital, community capacity, economic impacts, trust between public, scientists & managers.

Her main question is: if we think about intertwining of social & ecological systems, many stakeholders involved, can we improve their resilience?

Trip Report: London Citizen Cyberscience Summit, Day 3

My final set of notes from the summit on 18 February are below. They only cover the morning talks as I spent the afternoon in discussions with other attendees. Apologies for typos or bad formatting – typing on an iPad leads to weird autocorrects or missing spaces, and Posterous – well, don’t get me started. I hope I have time to migrate out of it sometime soon.



Open Knowledge Foundation

Supporting reuse and remixing of data and content; permission is a major impediment to innovation. Artificially intelligent chemical software can extract data from the literature but can’t use it due to publisher licensing – the technology is stalled by antiquated IP. OKF is a call for people to gather around the meaning of openness and how to make knowledge open. Not a campaigning group, but looking at how to create tools and infrastructure to get information out to as many people as possible. Example of malaria research: frustration that no one can read the literature for free. Many people don’t read the literature because they can’t afford to. This is relevant to medical, climate, and development contexts. Trying to change the culture so that it becomes the norm that people have the right to access the scientific literature. Working on this through scientific tools; one tool is Open Bibliography, which makes reference collections completely open – just the list of references. Making reference lists available is a valuable resource in itself. Emphasis on high quality research creation and software for infrastructure across disciplines. OKCon2012,


Cabell – Online Collaboration and Legal Concerns

Legal overview: global collaboration is confusing; laws are local, not global, which means some are regional and others are enforceable in different areas. Default rules exist around ownership and control of distribution of work; you have to actively change these settings, usually in writing. The rules are inconsistent, with limited interoperability. Generally, law provides the ability to exclude but not to engage, and prior rights may limit use of your own work.

IP basics – patents are very expensive, about $12K. An invention has to be new, useful, and not obvious, which is a lower bar than it sounds. What is protected is a method or process. The newness requirement leads to embargoes on publication – you have to file before publishing or you lose rights, so this holds up dissemination of research. Rights are to make, use, and sell unless blocked by existing patents. Patent ownership goes to the actual inventor; if you invent on the job, it’s subject to shop rights. Lasts for 20 years.

Trademark is about identifying the source of goods or services. There is usually a registration process, but not in the US, where you have to show public recognition of the brand; protection is limited to a class of goods and services. The term lasts as long as people recognize the brand.

Trade secret – lasts as long as it is secret. No legal definition; NDAs keep secrets. There is no right to use the information; protection is for preventing exposure.

Copyright protects original expression – not ideas, and not statements of bare fact. The eligibility criteria are low but differ by jurisdiction: intellectual effort vs sweat of the brow. The annual Bulwer-Lytton contest for the worst possible writing you can produce is an example of expression versus fact. Facts are free to use without attribution and therefore not copyrightable, but data aren’t necessarily limited to facts, so beware of underlying rights, which may be wildly different depending on the type of work involved – e.g., a database of photos has different rights than a database of numbers, or CDs and songs. It is not (always) true that data aren’t copyrightable: collections of facts are not copyrightable, but collections of X-rays are.

Rights include prevention of copying, distribution, derivatives, translations, display, and public performance, plus related rights like moral rights, including integrity (keeping the work intact without change). Rights differ based on type of work, e.g., artistic versus literary, which includes software, databases, and texts. Databases can be copyrighted as a compilation – a collection of elements which are not individually copyrightable. Copyright is automatic the moment the creator fixes the work in tangible form. Contributing thought is not the same as contributing expression, so coauthors who write nothing have no claim to copyright (watch out, PhD advisors!).

Works for hire automatically belong to the employer if created as part of job duties. Universities have different policies in this regard, e.g., as to theses. Funding sources can impose ownership and publication restrictions, e.g., funders requiring deposit of data or outputs. Specially commissioned works, e.g., from freelancers and consultants: wedding photographers own the copyright – it does not belong to the commissioner unless agreed in writing. Government works: federal work is in the public domain in the US, so no one owns it; Crown copyright in the UK.
Types of joint ownership – unless a group of collaborators thinks of the work as a single work, they are not really joint authors; if they do, each author has the right to sell the work. If only some authors consent to combined use, then it’s a compilation or collective work, and only the newly created combination is owned by the compiler. Duration of copyright is very complicated! SGDR (sui generis database right) – parallel to copyright, subject to abuse.

Other issues to consider: privacy is tightly restricted in the UK but hardly protected in the US, with only limited piecemeal protection. Discrete bits of info may not reveal an individual, but a combination of sources can, which is a risk when combining databases. The main question is where you operate: if you operate in the US but take data from the UK, you are subject to UK law.

Other relevant laws: human subjects research, public sector information, species and environmental information acts, import/export acts (e.g., software is an armament), child protection laws, national security, institutional and professional ethics.

Implications for citizen science: usually there is no legal entity for a voluntary collaboration, which means no centralized management or ownership can take control of IP – only a person or business can own something. Default settings may be inconsistent with a community’s intended uses of works; individual contributors can make decisions without consulting the whole; rights are piecemeal and distributed. In addition, the law treats collaborators as joint offenders: individuals are not protected from liability for harm done by others in the collaboration, e.g., copyright infringement, so one member can be held legally responsible for harm done by others in the collective. Best legal practices: know your own rights; document each contribution as well as possible, e.g., with version tracking that helps identify author, location, and date; where possible, formalize the collaborative organization to simplify legal application; carefully specify collaboration rights.

Open sharing has lots of standard public licenses, like OKF’s, the GPL, and CC. OKF has a reference list that shows how open various licenses are and what they apply to. Linked Open Data efforts are being used to facilitate sharing. CC is applicable in 75 jurisdictions. Recommendation: CC0 (public domain) for data, since attribution becomes too difficult. Natural history observations are considered statements of fact and not copyrightable, but comments about them would be.


Plantin – radiation mapping in age of bad data

Post-Fukushima: initially no data, then bad data. Worries about sensitivity and then mishandling; data produced by entities whose motivations could be questioned. Several radiation-mapping mashups appeared. Mapping radiation was a 4-step process: 1. Scrape it directly from websites – initially unstructured, so read through the source code. 2. Measure it – many people tried this, and it could be done by many different groups or organizations. 3. Aggregate it, e.g., with Pachube, a platform for online aggregation and redistribution through API calls. 4. Map it. There are examples using only official or only alternative data, but more interesting is the mashup using both sources, which is also useful for verification. Focus on one monitoring group, Safecast, an ad hoc group of engineers in Tokyo. Hard to know if it is science; they are not planning to intervene, only trying to provide data and trigger reflexivity. Not activists. Hackers but not hackers, tinkering in a DIY way but close to community – crossing the dynamics of science, activism, hacking, and community.
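The scrape-and-aggregate steps described above can be sketched roughly as follows. This is a toy illustration only – not Safecast’s or Pachube’s actual code; the markup, station names, and readings are invented.

```python
# Toy sketch of steps 1 and 3 of the radiation-mapping pipeline:
# pull readings out of loosely structured web markup, then average
# repeated measurements per station before mapping them.
import re
from collections import defaultdict
from statistics import mean


def parse_readings(raw_html: str) -> list[tuple[str, float]]:
    """Extract (station, microsieverts/hour) pairs from simple table markup."""
    pattern = re.compile(r"<td>(\w+)</td><td>([\d.]+)\s*uSv/h</td>")
    return [(m.group(1), float(m.group(2))) for m in pattern.finditer(raw_html)]


def aggregate(readings: list[tuple[str, float]]) -> dict[str, float]:
    """Average all readings reported for each station."""
    by_station = defaultdict(list)
    for station, value in readings:
        by_station[station].append(value)
    return {s: round(mean(vs), 3) for s, vs in by_station.items()}


page = ("<tr><td>Shinjuku</td><td>0.12 uSv/h</td></tr>"
        "<tr><td>Shinjuku</td><td>0.10 uSv/h</td></tr>")
print(aggregate(parse_readings(page)))  # {'Shinjuku': 0.11}
```

Real scrapers had to cope with far messier pages, which is exactly the "read through source code" pain the talk describes.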


Ishigaki –

Making radiation data available to the public. Created a device housed in a “Frisk” candy box – used because they had no time or money for plastic injection molding. Hooks into a smartphone; 4 color variations! (much laughter) Much better than a 40-lb Geiger counter. Free iPhone app, Pocket Geiger, takes 5-10 minutes to analyze your data. Factory right outside the tsunami disaster area, but income went down, so their nonprofit organization is creating many jobs for disaster recovery. Socially inclusive: 3 core members, 5 professionals (pedologist, Dutch DoD, Dutch NIST, NASA, Japanese CERN), 12 hackers, 10K+ users. User reports on the Facebook group: radiation levels high in children’s parks and drainpipes, very high in flight. They have millions of data points but are now running into privacy problems. Cities are creating monitoring posts for radiation and specialists are building high-accuracy devices, but people need to know radiation levels in their own homes as well. Hates the governmental model where citizens have no access to data.

Issues with inconsistent measurement by contributors: metadata is needed or the data aren’t usable – not just the units of measurement, but also the environment in which measurements were made.


Maisonneuve – Public analysis of satellite images

300K damaged buildings to assess (from an earthquake?); difficult for professionals to do at that scale. Organizational issues: how to organize non-trained volunteers to enforce quality and analyze a large area, either remotely or in the physical world.

Parallel model: n volunteers monitoring the same area, for inter-rater reliability. Another model is iterative: annotation and progressive improvement, like Wikipedia. Experimented with these approaches on 3 maps. Types of errors: false negatives and false positives. The parallel model reduces false detection rates; redundancy is useless if individual accuracy is already perfect (p=1), so experiments used p=0.5. Keeping only consensual results doesn’t solve problems of omission – there is agreement on obvious buildings but not on difficult ones – and results are sensitive to aggregation parameters. The iterative model is somewhat reversed: less omission of buildings, so better area completeness, but it is sensitive to destruction of knowledge in a basic implementation (last=best) and very sensitive to initial conditions, so the first player is very important – maybe experts are needed for this part.
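The parallel model’s tradeoff can be illustrated with a toy simulation: each of n volunteers independently labels every site, and only detections crossing a consensus threshold are kept. All parameters and error rates below are invented for illustration – this is not the experiment from the talk.

```python
# Toy simulation of the parallel annotation model: consensus filtering
# cuts false positives but worsens omission (false negatives).
import random


def simulate(n_volunteers=5, n_sites=10_000, p_hit=0.8, p_false_alarm=0.2,
             threshold=1.0, seed=42):
    """Return (false_negative_rate, false_positive_rate) after voting.

    threshold=1.0 keeps only unanimous detections; threshold=0.2 keeps
    anything a single volunteer flagged.
    """
    rng = random.Random(seed)
    fn = fp = 0
    for site in range(n_sites):
        truly_damaged = site % 2 == 0  # half the sites are damaged
        votes = sum(
            rng.random() < (p_hit if truly_damaged else p_false_alarm)
            for _ in range(n_volunteers)
        )
        detected = votes >= threshold * n_volunteers
        if truly_damaged and not detected:
            fn += 1
        if not truly_damaged and detected:
            fp += 1
    return fn / (n_sites // 2), fp / (n_sites // 2)


loose = simulate(threshold=0.2)   # any single flag counts as a detection
strict = simulate(threshold=1.0)  # only unanimous detections survive
print(loose, strict)
```

With these made-up rates, strict consensus drives false positives toward p_false_alarm^n while false negatives climb toward 1 - p_hit^n – matching the talk’s point that consensus reduces false detection but doesn’t solve omission.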

Skill is an issue: how many volunteers are needed to reach a certain level of quality? At some point you can keep adding people, but there are problems of scale – quality can be replaced by quantity. Issue of complementarity: aggregating the results of the test contributors, individually not all that great, but together you get much more value.

Second question is about training volunteers, an ongoing effort. Difficulty of a task can be assessed according to agreement: easy tasks have high agreement, but difficult ones have more spread. Last point: crowd learning can happen through others’ mistakes – you can identify the most common errors and use this density of errors to educate people according to previous contributors’ errors.


Foster – Project Seahorse: advancing marine conservation

Based out of UBC, committed to sustainable marine ecosystems. Know what is wrong, but have to figure out how to fix it.

Why seahorses? They are really cool fish! Most people don’t know that they are fish, have a bunch of cool evolutionary features (horse heads, marsupial pouches, prehensile tails) only species where the male gets pregnant. Seahorses are like a panda, no one cares about mudflats or mangroves until you tell them that seahorses live there. So saving seahorses means saving their habitats.

Threats include overfishing and targeted catch by small-scale fishers, but the majority are caught as bycatch by shrimp trawlers, which discard thousands of tons a year – like using a bulldozer in a forest to catch a deer. Threats to seahorses are threats to oceans and other marine life.

Captured seahorses are traded internationally, especially for traditional Chinese medicine, curios, and the aquarium trade. They retain their shape when dried, so they make interesting curios – seahorses with fish fins glued to them like wings, or clutching mini tequila bottles. The trade is large and global: 10M seahorses around the world in 80 countries, making it one of the biggest species trade problems. CITES regulates international species trade; all 46+ species are listed in Appendix II, which means international trade is permitted but regulated, so sustainability has to be proven. Seahorses are among the most important fishes on CITES – the first fish listed. Previously fish weren’t considered species for international regulation, and immediately after seahorses were listed, several other internationally traded fishes were added.

The problem is a lack of location-specific information about seahorses to help groups meet mandates for demonstrating sustainability. The IUCN Red List shows most of the species (28) are DD – data deficient – and the other 8 species are EN or VU, basically threatened or endangered; chances are good this is true for the data-deficient species too.

Can’t spend her life diving to find seahorses around the world, for lots of reasons – the most important is lack of time, because we need to act now. Fortunately people are diving the world already and sending in info about seahorses. This is being done for other taxa already, but it’s new for the ocean; few other projects focus on marine species. The best examples of citizen science are birds (eBird), done so well that they have monitored conservation status of over 40K species. The challenge is addressing marine problems; wants to “give seahorses wings.”

The marine environment is extreme for monitoring: can’t get GPS underwater, and most electronics don’t work there, in part due to pressure. SCUBA surveys overcome the location issue by tethering to floats on the surface. Another problem: a seahorse is easy to identify, but telling which species it is is very difficult. Wants to start a project so sexy that every diver will give them data and feel they are making a difference for ocean stewardship. Critical because divers won’t be able to enter data immediately – maybe they make notes on the dive record, but they still have to want to enter the data back at the hotel before getting into Coronas on the beach. Has made progress here already! Trying to work with EpiCollect; someone else has offered to help with branding, but she also needs a protocol and a toolkit for monitoring seahorse populations, and to train worldwide groups to assess these trends locally as partners. Needs help with feedback tools beyond point maps and bar charts, like overlaying seahorse info with other marine data. If they can map where the threats are, it helps communicate a sense of urgency and conservation needs, and prioritize monitoring locations. Needs info on lessons learned and experiences.

Jones – iBats: using smartphones and citizen networks to globally monitor bats

Many indicators show recent declines in global biodiversity. Looking into smart monitoring: bats are a good indicator – a fifth of all mammal species, widespread and sensitive to global change (behaviors sensitive to temperature), and important ecosystem service providers. Cool animated radar graph of bat emergence in Texas, saving a third of crop pesticide costs by eating up bugs. Bats are also interesting because they emit radiation in the form of echolocation, using ultrasound to communicate and locate objects – you can sense bats from this leaking radiation. They created a database of bat acoustic biodiversity and want to use it to classify bat species. If acoustic monitoring could be done with these identifiers, why do this and where? A combined index map shows areas where the current potential for using the tool is highest due to call similarity, e.g., very different calls in certain areas.

Have tested acoustic species classification tools; most call types can be identified at over 97% accuracy, except one group of calls. Continental-scale tools would be the ideal. Using an ultrasonic microphone (very expensive, 400 GBP; trying to hack a cheaper version they think they can do for 10 GBP) plugged into the smartphone headphone jack; a special microphone is needed because high-frequency sound requires high-speed sampling. Have developed portals/versions in different locations and languages. Started off in Romania (of course!) but the effort has moved around the world.
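To make the classification idea concrete, here is a minimal nearest-centroid sketch over two invented call features (peak frequency in kHz, duration in ms). The species, feature values, and method are illustrative assumptions only – the actual iBats tools use far richer acoustic features and machine learning.

```python
# Minimal nearest-centroid sketch of acoustic species classification.
# Training data and feature choices are invented for illustration.
from math import dist

TRAINING = {
    "Pipistrellus pipistrellus": [(45.0, 5.0), (46.5, 5.5), (44.0, 4.8)],
    "Nyctalus noctula":          [(20.0, 12.0), (21.0, 11.0), (19.5, 13.0)],
}

# One centroid per species: the mean of each feature across its calls.
CENTROIDS = {
    species: tuple(sum(vals) / len(vals) for vals in zip(*calls))
    for species, calls in TRAINING.items()
}


def classify(call: tuple[float, float]) -> str:
    """Assign a call to the species whose centroid is nearest in feature space."""
    return min(CENTROIDS, key=lambda s: dist(call, CENTROIDS[s]))


print(classify((45.5, 5.2)))  # nearest to the Pipistrellus centroid
```

The talk’s note that one group of calls resists identification corresponds to species whose centroids overlap in feature space, where a distance rule like this breaks down.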

From there, building distribution maps using machine learning and creating hotspot maps to inform conservation policies. Can do trendlines to show bat populations as a headline indicator of ecological health. The latest project is also doing this for frogs and insects like crickets. Got Zooniverse funding for Bat Detective, so they’re currently working on noir branding for the project.

Trip Report: London Citizen Cyberscience Summit, Day Two

Notes from 17 February talks in London – for full details about speakers, see the full program at

More really exciting talks! Great diversity of projects represented, wildly variable technology sophistication, and fascinating people. Check out the #LCCS2 hashtag on Twitter to see the discussions.


Plug for EpiCollect – an easy-to-configure interface for setting up mobile-based monitoring and web-based result displays; non-technical project organizers can use it easily.

Igoe – NYU Tisch school of the arts – Keynote

Citizen science is a good term, but it sets up a divisive dynamic: the people you try to engage come with their own expertise.

Biggest challenge at the ITP program is getting students to understand each others’ backgrounds. Lots of emphasis on making and hacking. Igoe runs the physical computing program; lots of students want to go beyond the computer in terms of interfaces. Trying to get them to start their thinking from what people do physically – start with actions, not with mapping to an existing controller. Lots of art projects; some are less functional or just plain strange – “Circadian squirrel,” a stuffed robotic squirrel that moves around and then removes its head, created by Jill, a former librarian. She also made cricket headphones, little habitats for crickets that you wear like headphones. Another person made a radiation detector out of gongs, something of a Trojan horse to get attention from people who otherwise wouldn’t notice.

Biologically generated materials – electrolysis for building up calcium carbonate as a sculpting material; people interested in chemistry and biology then got into the aesthetics of the sculpture. They use Processing to get people up and going with programming quickly, and a lot of Arduino, which lets people build good instrumentation right away. Example of a balance board to help stroke victims that students built in 2 hours, programming the visualization interface in 15 minutes. The speed of development allows easily throwing away things that don’t work – you don’t get attached to the thing because of the investment in creating it, which means you do more and innovate faster.

Lasersaur – a laser cutter that can be built from a kit for $500. OpenPCR lets you do DNA analysis for about $500 and gets people into recreational biology. Not all of the projects are functional like this; others let people explore their own expressiveness. Adaptive technologies help people explore their everyday lives: noticed the way physical expression happens in wheelchair users – just as expressive as any other body language. Created a pair of ramps so a person could DJ by wheelchair movement, letting him use his skills to do what he wants. An MD sufferer who loves MLB on PS3 could no longer play his games since the disease reduced his capacity; students created a controller that works for his abilities so he can play again.

An occupational therapist got tired of the paradigm of “give us specs, we’ll go away and build it for you” – the devices were never what she wanted, so she wanted to build them herself. Created a range-of-motion measurement tool that makes music that gets better with greater range of movement. Patients improve faster because they focus on making music, not doing exercises.

Interdisciplinary collaborations: students wanted more plants in their environments but were concerned they would all die. Created a plant moisture and humidity sensor that calls you when the plant is sick; now they’re selling the kit, which lets you DIY but also gets you attached to a plant. Project Noah was a procrastination that has worked out really well.

Igoe loves monkeys, and someone called him on his bluff about wanting to work with them: an anthropologist wanted to use motion technologies to track monkeys, so Igoe proposed a class on tracking monkeys, which actually was approved. So he ended up in the rainforest tracking monkeys – an interesting interaction design challenge for students. Focus on what people do for their work and how to improve that: instead of teaching students about primatology, they taught primatologists about technologies, but the learning went both ways. Example of clunky telemetry antennas and old PDAs that the primatologists are stuck with and cannot replace. Students thought they could just move to Android, but there is no network – so one student put together a cell network using observation towers, almost up and running.

Primatologists do a lot of analysis with monkey poop that they bring back to NY. Problems with broken gel combs that cost $60 each; a student was able to use a laser cutter to make them for $10 each. Sometimes they send students to the zoo to watch monkeys – students will watch for 3-4 hours in the cold! Students got obsessed with observation protocols, applying them to game design. They play hide and seek with radio collars and discover that it’s not so easy to find people with radio collars; the technology is crude and you have to learn a lot about radio to use it. Some things you can teach in theory; others you have to experience.

Competition to develop a tool to measure monkeys without “taking the monkey down.” The winning device was created by a photographer and an engineer whose job was running a fab lab, so they had unlimited access to tools. Tested it on stuffed animals, then found that in the wild it worked so well it was only 0.5 cm off. Students got bored with it and moved on, but left the plans for others to build on.

Another project, funded by UNICEF, deals with clean water issues. Water detection tools are expensive; students hacked a mass spectrometer from an Arduino. Not that expensive – the students formed a company to market their tool and release it openly so others can make it as well. ITP promotes a lot of OSS and OSH. Students work to make things for nonprofits and researchers, and it’s working out really well. A lion collar warns farmers when lions are nearby so they can move their cows.

Key thoughts they transmit to students: art, science, engineering, and design are all deeply personal; an idea doesn’t matter until it’s used in real life. It’s worth being promiscuous – the best ideas are those that grow; those that are hidden do not amount to much. The things we make are less important than the relationships we support.

Qs: Are there things bridging the gap between citizen science and art? Yes, more overlap than you think. Students ID most of the problems to address.

What do you think the role of self-selection is? Classes are idiosyncratic; self-selection is valuable to producing useful outcomes. They spend a lot of time on admissions to get people who will work well in this program. How to make it happen in other schools: be open to cross-disciplinary collaborations.

Have you done any biohacking? Not yet – dying to. Has attended workshops; interesting to see what comes up. Postgrad adults in Genspace workshops feel much freer about expressing themselves in class, about failing, because there’s no grade, so they experiment more. Discussion about sample swabbing, aesthetics and patterns for doing bacterial cultures, and what works best. What about PCR tools? Haven’t done that yet but planning to; people are finding results are not the same using these tools.

How do you get people to realize they have an interest? The answer is counterintuitive – it’s listening: finding out what people want and need before they create anything or even agree to try, making sure it will work.

You have a diverse group of people – how well received are you by academics? The reason he is here is that he works with Francois Grey; overall the reaction has been good. People know what they are and what they’re not; academics come to them looking for new ways of looking at things, not precision. They are in a performing arts school, and are hard core in that context. It’s about knowing what they are and are not and being clear about that. Recently started a summer camp: people want a starter course, or alumni want to retool their skills after a few years out of school with all the technologies changing. Important because hackerspaces are replacing what they do, so their role in the university is less relevant when the community is doing the same work – they can either fight it or work with it, so they work with it.

How much do you expect for admissions? Just a bachelor’s degree, with some exceptions; they train students in all the basic skills they need. They try to admit a diverse group – not too many performance artists or engineers – making multiple passes to balance the composition of the group.


Dosemagan – Public Laboratory for Open Technology and Science: open source development of tools for grassroots science

Low-cost DIY environmental and health tools – science is too expensive, too much science is academically oriented, there’s a lack of on-the-ground experience among researchers and a lag in knowledge exchange; people on the ground don’t own the data, and researchers don’t understand the problems. They call it civic science (the term “citizen science” being problematic due to other baggage).

BP oil spill – mapping using cameras in 2L bottles on balloons and kites; mapped the oil spill over time. The point of grassroots mapping is offering an alternative – letting communities address their own problems and issues, like a media blackout and the inability to engage with the spill despite local impacts.

All the work they do requires open source licensing; they maintain a public domain archive and have an agreement with Google to show their maps on Google Earth with their data. Beyond the archive, they needed to take results into different formats to disseminate information: they printed the grassroots map and dropped copies off at marinas, seafood restaurants, and gas stations so people can pick up maps and do ground truthing. The initial project spread quite a lot; a second project is at the Gowanus Canal Superfund site in Brooklyn, with others in countries all over the world. In Brooklyn they are able to see inflows and other problems at the site, then do ground truthing, and they have started developing new applications and techniques. Another is a near-infrared camera, based on NASA remote sensing, used to look at vegetative health in wetlands and other sites. The approach helps reduce barriers to new efforts; they call these intensive hacking events barnraisings. A camera-hacking event in NC brought together a broad team; now they can use infrared imaging for aerial monitoring and show photosynthesis in urban and wetland areas.

Unexpected impacts – aerial mapping of protests in South America. Locals not only made a device but also wrote instructions on how to build it and how to get the materials in Chile, as well as costs. Livestreams with iPhones showed that the area of activity was much bigger than officials were saying. The technique has been adopted much more broadly; it subverts the corporatized view of space and place – it’s about a moment in time, not just a place. First used in the US for Occupy events.

Low-cost approaches to environmental health and toxics. Tracks are healthy homes, community science, and sustainable futures. Healthy homes – Roomba-based indoor air quality mapping, taking long-exposure photographs to see paths and activate test strips. Now working with formaldehyde sensors. Moving away from Roombas because they move too fast; they now use hamster balls with $4 robots inside.

Thermal imaging – a customized flashlight used with long-exposure photography to find heat-loss areas so people can take action; immediately analyzable and usable. Same idea with formaldehyde: identify brands of carpet that release more of this chemical, which is linked to asthma.

Community science – hydrogen sulfide sensing; it is a neurotoxic gas produced by bacteria in gas wells. Setting up bucket brigades for air sampling: the normal cost is $500 per analysis and the sample has to be returned to a lab within 24 hours, which is hard in rural areas. Hydrogen sulfide tarnishes silver, so they are creating silver halide screens using photographic paper to sense the gas. Working closely with academics to standardize the test in the lab so it can speak to the validity of the science.

Other projects look at environmental estrogens; interested in developing DIY tests to examine water sources and aggregate information on environmental health threats. A DIY spectrometer to ID a broad range of chemicals – a kit you can buy online. Also interested in supporting alternative agriculture and environmental remediation: an infrared camera and low-cost image analysis with a “classifier” to support remediation projects.

Focus is on community oriented, developed, and owned science to address local interests.


Zoological Society of London – Instant Wild – Hardware vs Hyenas

App that transmits photos from the wild to phones for instant analysis. Focus is promoting and achieving worldwide conservation of animals and habitats. Have used camera traps, which require manual retrieval and troubleshooting; new technology is making that wireless. Currently have an iPhone app that lets members of the public identify animals in photos and see new species during their commute. Branding is based on the app. Only have 6-7 cameras out there; issues with hyenas. For the Sri Lanka sites, they don’t release specific locations, to prevent poaching and trapping. The app allows people to debate IDs; have had 80K downloads in 3 months, over 320K IDs, and 7K regular users. Species in Kenya – porcupines and elephants. Sri Lanka – porcupines and deer. Surprises – fishing cats, African leopards, owls. The mountain mouse deer was only first photographed 3 years ago, so they are finding and recording rare species live.

Next steps are increasing camera numbers, continuing to develop next-gen cameras and the network, and protecting cameras from damage: hyenas eat cameras! Even in security cases, hyena teeth go through LCD screens. How does it work? Cameras with LED flash, GSM antenna, PIR motion sensor, lens, and security lock, saving to SD card. Target audience is commuters on the way to work – bored people who can spend time looking at the data.

The question is how to transmit data from the wild to central London. Tools include Arduino, Raspberry Pi, and Digi. Current cameras include ScoutGuard, UWay MMS, and Reconyx – the best trail camera ever made. Limitations: SIM-card-based cameras are limited by GSM coverage, not scalable, and don’t give enough control. Problem of saving to SD card – they need to send photos without stopping camera operation, so they are looking at making SD multiplexers.

New projects include canopy measurement, air sensing, and sound sensing – like gunshots in the forest or logging trucks where they don’t belong. Forest Hotspots for getting data out from tablets in the field. Using ZigBee, XBee, Raspberry Pi, and other technologies for transmitting via 2G, 3G, and satellite. The usual approach is a satellite dish, which requires a truck and power generator; they do data transmission at night when there is enough bandwidth. Also doing skychat between Wales and Africa, sharing traditional dances from each culture. Way more talk about technical components and tools that I don’t know anything about!

New technologies are cheap; they need ruggedized cameras so they buy those, but can DIY the rest. Will be doing more releases of Instant Wild as the new technologies are put in place.


Loreto – EveryAware

Consortium with lots of EU members. Trying to address the problem of organizing. Combining objective and subjective measurements while enhancing individual awareness, which they hope will trigger change in individual behaviors and generate policy pressure. Themes include social computing, participatory sensing, geolocation, and other aspects.

Turning users into sensors; the main difference is collecting the subjective as well as the objective and linking the measured quantities with opinions, perceptions, impressions, and personal experience. The question is whether access to these data changes our understanding. Trying to understand how opinions emerge, shift, and change in a population – complex systems plus opinion dynamics; they call it technosocial systems. New opportunities: understanding and controlling information dynamics, using the web as a laboratory for social sciences, and raising awareness and participation.

EveryAware platform – a sensor box with GPS, accelerometers, temperature, humidity, noise, air quality, and a Geiger counter. The subjective part is tags, annotations, votes, and comments. Uses a smartphone to send data back and forth to servers for instant feedback; different sensor boxes and smartphones for different foci. Also interested in web-only experiments with online games, etc. Case studies around Europe; game-theory-based experiments. XTribe is a web platform for social computing and experiments; goals include a standardized laboratory for social sciences and a basin of attraction for recruitment, with a wide range of potential research areas. Games include Blindate; Guess Where (how do you perceive maps?); City Race, about strategies for mobility given limited information, compared with Google routing; and Nexicon, about word association and coordination.
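The core idea above – pairing every objective sensor reading with the contributor’s subjective annotations – can be sketched as a simple record type. The field names and structure here are my own assumptions for illustration, not the platform’s actual schema.

```python
# Hedged sketch of an EveryAware-style record: one objective measurement
# plus the subjective layer (tags, free-text impression) attached to it.
# Field names are illustrative assumptions, not the real API.
from dataclasses import dataclass, field, asdict


@dataclass
class Observation:
    lat: float
    lon: float
    noise_db: float                                  # objective: sensor-box reading
    tags: list[str] = field(default_factory=list)    # subjective: annotations/votes
    comment: str = ""                                # subjective: personal impression


obs = Observation(50.85, 4.35, 72.4,
                  tags=["traffic", "annoying"],
                  comment="rush hour on the ring road")
print(asdict(obs))  # dict form, ready to serialize and send to a server
```

Keeping the subjective fields in the same record as the measurement is what lets the platform link quantities to opinions, as the talk emphasizes.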

Ellie D’Hondt – Participatory Mapping

City air pollution issues affect a lot of people and make good learning laboratories. Cities are responsible for 70% of greenhouse gas emissions. High potential for volunteer sensing of environmental parameters – people already carry mobile phones with potential application to sensing. Goal is implementing citizen observatories focusing on noise, microclimate, and air pollution.

Main focus so far is noise, a big problem in cities all over the world. A recent WHO report estimates that Europe loses 1M life-years to noise pollution, and it really gets under people's skin. The problem is real, representative, and tractable. NoiseTube project with GPS smartphones, Internet connectivity, and a map server. The app seems very similar to WideNoise, but it samples every second and auto-uploads if you have a data plan, so when you get home the tracks with noise measurements are ready to view; manual upload if no data plan. Use is mostly in Europe but it is used worldwide by uncoordinated individual users. Projects are starting up without their knowledge, e.g., using the platform for primary education in Lyon. Another application is coordinated grassroots campaigns – coordinated is when people want good data, e.g., community organizing groups in Antwerp, where the port leads to lots of noise. Rigor is important if policy is the goal; you need something realistic that will convince authorities. Codesigned a citizen science experiment to address these concerns.

Lots of noise mapping is going on, but it isn't sufficient. Worldwide issue, lots of European efforts. Large cities are required to generate noise maps every 5 years. They put the info into propagation models to fill in gaps, e.g., with building canyons that retain/echo noise between buildings. Health norms say 50 dB during the day and under 40 dB at night; actual levels are well in excess, and WHO norms just can't be achieved.
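The sampling behavior described above (one reading per second, auto-upload with a data plan, queued for manual upload without one) could be sketched like this. This is not NoiseTube's actual code; `read_db`, `read_gps`, and `upload` are stand-ins for the platform's real microphone, GPS, and networking APIs.

```python
import time

def collect_noise_track(read_db, read_gps, has_data_plan, upload, seconds=5):
    """Sample noise once per second; upload immediately or queue for later."""
    queued = []
    for _ in range(seconds):
        sample = {"db": read_db(), "gps": read_gps(), "t": time.time()}
        if has_data_plan:
            upload(sample)          # instant feedback on the server's map
        else:
            queued.append(sample)   # uploaded manually when back home
        # time.sleep(1)  # one sample per second on a real device
    return queued

# Simulated run without a data plan: all samples end up queued.
track = collect_noise_track(lambda: 55.0, lambda: (50.85, 4.35),
                            has_data_plan=False, upload=lambda s: None)
print(len(track))
```

The per-second cadence and GPS tagging are what turn a walk through the city into a geo-referenced noise track on the map server.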


Cavalier – Being a Citizen Scientist

Goals are raising interest in and understanding of science, growing the ranks of citizen scientists, and encouraging citizen involvement in research projects and policy discussions. Learned about citizen science at UPenn and focused on those goals in her capstone project. The loss of the Office of Technology Assessment is problematic in terms of lost public feedback on science policy.

Too much prior sense that the public is dumb and scientists are wary. Adult scientific literacy is very high in the US, second only to Sweden, but youth science literacy is very low. Ideas around public engagement.

Citizen science yields serious science. Initially it focused mostly on birders, and also water-quality monitors. Measurement calibration is not the big issue; it's recruitment. It is not easy for would-be contributors to find a project. Started SciStarter to connect people to citizen science; considered one of Philly's top 10 tech startups. Simple site, easy to use, focused on what people want to do – what can I do at the beach or on a hike? They don't create projects, they aggregate info, and they do heavy editing and approval to ensure quality. They then market the projects.

Seems to be working! The Mastodon Matrix Project saw participation double, and that can be traced to SciStarter. SnowTweets saw 3x participation during the month they were promoted. National media partnerships help – NBC News, Discover Magazine – riding coattails. New animated widget with Mission Impossible theming for the project of the week on Discover Magazine, which has 2M online readers. Recruitment services are free and also geographically targeted. Motivations to act – advancing research – you're asking someone to give up something to do your project. What would make you stop what you're doing and give up family time to participate? Examples: firefly tracking, Belly Button Biology, BioCurious, mapping AEDs to provide information for emergency responders. Other motivations – civic concerns, money.

Cheerleaders – NFL and NBA cheerleaders, using pop culture icons to promote science. All are professional scientists; they only get paid $35 a game, so they tell little girls that they need another career plan. Wide variety of project resources; part of the goal of SciStarter is leveling the playing field by showing everything in the same format. Important not to put out too little or too much information. Reinventing the wheel – happening too much. Rick Bonney is a trailblazer in nature-based projects but also more broadly, especially with respect to design and evaluation of citizen science. She is trying to advocate more policy involvement, e.g., an office of technology assessment, which often has participatory aspects. What came from her writing on that is a new network that is pilot-testing technology assessment focused on biodiversity. Trying to establish what an OTA might look like – ECAST: museums, academics, and others in partnership. Political issues were part of why the OTA shut down. When there is direct deliberation and conversation, there is more similarity between political parties than people think. Metrics of interest? Haven't gotten as far as they could – participation spikes when national media partners cover a project, which is why partnerships with media are so important.


Riverfly Partnership

UK-based project drawing on bottom-up enthusiasm. Water-quality monitoring for rivers; a network of anglers, conservationists, entomologists, scientists, watercourse managers, and relevant statutory bodies. Launched in 2007, concerned with protecting river water quality.

Using the volunteer network as a trigger for statutory intervention, which rigorously monitors 3K sites every 3 years; the volunteers' monitoring protocol is less rigorous but good enough to identify critical issues. Issues with local groups hoarding data; they need to consolidate what they're doing, but there is local impact – 3 localized court cases where volunteers identified breaches of policy by businesses depleting river resources. Strong and effective policy link, very bottom-up origination. They need a lot of help with technologies: a central online facility, a smartphone app, online validation, automated analysis and reporting, etc. Huge needs but few resources, a major gap in technical expertise, but they know what they need from a functional-requirements perspective.


Wilson & Cundy – CERN@school, Langton Star Centre

Focus is ionizing radiation; the public has a poor understanding of the science and sees it as more dangerous than it is. Langton Ultimate Cosmic ray Intensity Detector – LUCID – uses sensors made for the LHC and other particle accelerators, but can they be used for other purposes? They produce images, not just beeps like Geiger counters. In a competition for a space project to detect cosmic radiation, school kids used Timepix sensors to create the winning project, which will be sent up on a satellite. Now applying them for other purposes to visualize radiation. The Medipix chips are a family of hybrid pixel detectors developed at CERN; you can see what particles you're looking at from the visualization, which helps students understand what radiation is much better than beeps do.

Data from LUCID and CERN@school will be uploaded to GridPP (the particle physics grid) to be made available for students to analyze. A cool collaborative product that lets them learn not only the physics but also the computer science required to support this research. Architecture and requirements for design.


Dumitriu – Normal Flora Project: Bacteria and Bioart

Artist who works in unusual media, in between science and community. Long-term project focused on the unseen, unnoticed, ubiquitous bacteria, yeasts, and molds around us. Fascinated by bacteria: more bacterial cells in our bodies than body cells, more bacteria on your fingertips than people in the world. Focus is "how sublime is your ecosystem," not "how clean is your house" – these organisms are integral to our lives; we'd die without them.

Crochet installation replicating bedroom bacteria, made collaboratively. Needlepointed onto a chair. Shows these images at hospitals as outreach and education; working with craft techniques makes the topic approachable to little old ladies. Bacillus mycoides has a beautiful structure, looks like lace; she found the bacterium cultured from different locations has different colors. Also does microbiological performance intervention art, e.g., at the Brighton Fringe Festival: implanted agar into the ground, which pulled up soil bacteria, then they had a discussion about the way bacteria communicate chemically. Kryolab collaboration with an arctic bacteria center in Finland; she was able to exhibit her strains of new arctic bacteria in the gallery, but had to get certification that they are not dangerous in order to bring them to the UK for the installation. Had to ship arctic ice to the gallery to give the bacteria an appropriate habitat, which opened up conversation about climate change.

Installations about bacterial scale-free networks, cybernetic bacteria, infective textiles. Open lab at a lighthouse with homemade agar, culturing their own bacteria; she works with a safe protocol for doing this, developed with a microbiologist. Stains the bacteria and uses them to make dress decorations. Embeds images of bacteria communicating in garments, because some bacteria change color when they communicate. Staining period pieces to reflect on gentleman scientists.

MRSA quilt experiments, based on the idea of culturing bacteria from her own nose, though she wasn't carrying MRSA. Textile pieces were inoculated with MRSA bacteria and then cultured to grow the blue bacterium on the textiles, using turmeric as an antibiotic to prevent growth in some parts of the textiles. Project to let people make their own MRSA quilt pieces, then culture them.

Working on a BSL-2 lab that is gallery-safe to cultivate pathogens for use in art, funded by the Wellcome Trust and working with microbiologists. Was allowed into a secret lab and got to handle category 3 organisms; comfortable working with level 2. Has been recommended to become a registered microbiologist so she can do the work independently.


Paulos – Hybrid Assemblages, Environments, and Happenings for Participatory Culture

Starts from Operation Moonwatch in 1956, thinking about technologies and human experience related to science. Supernova 2008ha, found by a 14-year-old astronomy enthusiast – who had access to more equipment than the average person. But most of us have phones that are really little supercomputers with sensors, etc.

Who participates in making visionary science happen? New challenges at this scale – environment, famine, healthcare, literacy, the economy. Different strategies for addressing each of these through participatory projects. Concept of microvolunteerism – we know our neighborhoods best; quote from Elinor Ostrom about citizens having the right information more than bureaucrats do. Connections to the DIY community, a manifesto of open disruption and participation. Value of helping people be curious about our world and explore it in new ways. Innovation companies don't care about SATs and skills; they want to know if you can brainstorm all the possible uses of bubble wrap. Need to rethink education, bridging laboratory and field-site views to support the health of cities. Interesting trend in the rise of the expert amateurs, not just citizen science, moving from proprietary innovation to populist innovation. Scientists must abandon their white lab coats, including the invisible ones they wear in their heads.

Living Environments Lab, variety of projects. Citizen science is not just valuable for science, but also for kindling curiosity and a sense of wonder. Value for literacy, data, grassroots participation, awareness. Using mobile technologies for measuring air and water quality, they found that people returning monitoring devices said they changed their behavior based on their awareness of air quality; seeing data changed the way they saw things around them. Once you expose people to new info, they change behavior. Opening the landscape beyond personal sensing – mobile infrastructure, indoor fixed air-quality sensors placed in public to see what people did. Sensors on street sweepers cover the whole city very rapidly, good for data over time. Gave sensors to the community; people had different strategies for installing them. Also got calls from the police, because to them anything that is a technology but not a cell phone is a bomb.

Also trying to drive down sensor costs to give them to hundreds of thousands of people; critiquing the sensors themselves; exploring other ways of interpreting data, e.g., instead of the shortest route, the cleanest-air route. A shirt that shows data about air quality, a breathe/don't-breathe sign, a ticker about health value. And spectacle computing: small sensors on balloons that glow based on particles and gases – sometimes you don't want people to miss the computing, you want them to notice, participate, and spread the word.

Microvolunteerism – 42 seconds at an intersection, what can I do? Developing their own kinds of platforms, like EpiCollect and ODK. Developing campaign-based efforts to investigate and manage projects from the bottom up.