Crowd-sourced wisdom

Here’s something quite fun to learn about. The ABC recently ran a piece on crowd-sourced wisdom, with an outcome that might not surprise some mathematicians but would surprise most of us: ‘the wisdom of the crowd’ is often just as accurate as rigorous scientific investigation. Whether we need more precision depends on the purpose of the question being asked, but it is relevant to our program, where we need participants to add a number that represents the collective view about the ‘strength of connection’ between the ecosystem and community values.

Don’t overthink the answer

The scientifically-minded among us tend to worry far too much about whether what we say is accurate. We fret about what it means and how we go about it. Then we might dwell on the method for a long time, rather than letting our instinctive wisdom take over.

There is a fabulous team-building experiment called the marshmallow challenge. Kindergarten kids often beat CEOs when it comes to building a bridge from spaghetti and marshmallows. By the time the CEOs have worked out how to start, the kids have completed the bridge through sheer iteration and experimentation. It’s not pretty, but it does the job: it looks and behaves like a bridge. Overthinking leads to delays and becomes an obstacle to getting decisions made.


The value of conversations

When we’re in teams sharing knowledge, the groups that quickly discuss how strongly they feel will most likely reach a consensus faster than the groups who are intellectually challenged (by which I mean, our intellect becomes the obstacle to overcome because we worry about getting the answer wrong). The reality is that there is no perfectly right or wrong answer, but ask any group of people and, after some discussion, they’ll often settle on a similar outcome. This is the wisdom of the crowd.

The wisdom of the crowd

In this wonderful experiment, the ABC asked a group of kids to estimate the above-ground weight of Uluru. Then they asked a researcher, who produced a 3D model and extrapolated its weight based on the known density of the rock. Even though some of the kids’ estimates were wildly inaccurate, their average ended up just 15% different to the scientist’s estimate. So close, in fact, that the crowd-sourced result might even be the more accurate one, since the scientific estimate carries its own margin of error and no-one can ever know the precise weight anyway.
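The statistical effect behind this can be sketched with a toy simulation (the numbers here are purely hypothetical, not the ABC’s data): even when each individual guess is wildly off, the over-estimates and under-estimates tend to cancel out when we average a lot of them.

```python
import random

random.seed(1)

# Hypothetical 'true' value -- purely illustrative, not the real weight of Uluru
true_weight = 1_425_000_000

# 500 people each guess badly: anywhere from 20% to 180% of the true value
guesses = [true_weight * random.uniform(0.2, 1.8) for _ in range(500)]

# The crowd's answer is just the plain average of all those rough guesses
crowd_estimate = sum(guesses) / len(guesses)
error = abs(crowd_estimate - true_weight) / true_weight
print(f"crowd estimate is off by {error:.1%}")
```

Run it a few times with different seeds: the individual guesses can be out by 80%, yet the average typically lands within a few percent of the true value.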

It matters how you’re going to use the answer

What does this prove? Well, if we’re looking for a quick answer to a simple question, such as ‘what is the strength of connection between how people feel and an ecosystem feature?’, then asking a few people and averaging their answers is likely to be accurate enough.

As we discovered earlier, the patterns in the knowledge data don’t have to be that precise to be a sufficient basis for important decisions. It depends on the context of why we need the information.

In the case of the marshmallow challenge, if it looks like a bridge and behaves like a bridge, then we can tell it’s a bridge. If we know something is a bridge, we can make some sensible decisions about where to cross a river. Later on we might want engineers to build a stronger bridge, sure, but right now a bridge is a bridge.

Why is community knowledge critically important?

We scientists can have an awkward or skewed idea of the context of some questions, and we usually can’t agree on anything quickly enough. Plus, we know that asking scientists to input cultural ecosystem services data in Restore the Bay is not the right approach to start with. For this, we have to talk to you!

Community members who don’t overthink answers naturally have a better degree of judgement. Scientists don’t always like to hear this, but it’s also an important scientific finding, and it’s why we cannot, as scientists, presume to know what a community needs or how it behaves. Nor can we expect to make timely and informed decisions by waiting to investigate every component of a complex system.

The number of cultural connections in a system like this is in the tens of thousands. The good news is, we can populate those data accurately and quite quickly using community knowledge.

By mid-year, we will have assessed 34 ecosystem features x 44 services x 42 values = 62,832 links in the decision tool. Imagine trying to do a scientific investigation on every one of these. It would be not only unachievable but, if we use community knowledge, entirely unnecessary.
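The figure above is a simple combination count: every (feature, service, value) triple is one link, so the total is just the product of the three, quickly checked:

```python
# Each combination of one ecosystem feature, one service and one value
# is a single link to be assessed in the decision tool
features, services, values = 34, 44, 42
links = features * services * values
print(links)  # 62832
```

Even at one link per minute, that is more than 1,000 hours of assessment if done one at a time, which is why spreading the task across the community matters.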

This is one vitally important reason why doing the groundwork and enabling community co-design is so important before we start making decisions.
