Countless computers worldwide automatically fingerprint our use of social media around the clock without our knowledge or consent. So we’re left with the following choice: stay digital and face the Eye of Sauron, or excommunicate ourselves from social media and face digital isolation from society. I’d choose the latter were it not for the life-saving role that social media can play during disasters. So what if there were a third way? An alternative that enabled us to use social media without being fed to the machines. Imagine if the choice were ours. My PopRock Fellows (PopTech & Rockefeller Foundation) and I are pondering this question within the context of ethical, community-driven resilience in the era of Big Data.
One result of this pondering is the notion of a #noshare (or #ns) hashtag. We propose using this hashtag on anything that we don’t want sensed and turned into fodder for the machines. This could include Facebook updates, tweets, emails, SMS, postcards, cars, buildings and even our physical selves. Buildings, for example, are increasingly captured by cameras on orbiting satellites and by the high-resolution cameras fixed to cars used for Google Street View.
The #noshare hashtag is a humble attempt at regaining some agency over the machines—and yes the corporations and governments using said machines. To this end, #noshare is a social hack that seeks to make a public statement and establish a new norm: the right to be social without being sensed or exploited without our knowledge or consent. While traditional privacy may be dead, most of us know the difference between right and wrong. This may foster positive social pressure to respect the use of #noshare.
Think of the #ns hashtag as drawing a line in the sand. When you post a public tweet and want it to be read by humans only, add #noshare. The tag simply signals to the public sphere that your tweet is for human consumption and not for use by machines; not for download, retweeting, copying, analysis, sensing, modeling or prediction. Your use of #noshare, regardless of the medium, represents your public vote for trust & privacy; a vote for turning this hashtag into a widespread social norm.
Of course, this #noshare norm is not enforceable in a traditional sense. This means that one could search for, collect and analyze all tweets with the #noshare or #ns hashtag. We’re well aware of this “Barbra Streisand effect” and there’s nothing we can do about it just yet. But the point is to draw a normative line in the sand, to create a public and social norm that provokes strong public disapproval when people violate the #ns principle. What if this could become a social norm? What if positive social pressure could make it unacceptable to violate this norm? Could this create a deterrence effect?
Either way, the line between right and wrong would be rendered publicly explicit. There would thus be no excuse: any analysis, sensing, copying, etc., of #ns tweets would be the result of a human decision to willingly violate the public norm. This social hack would make it very easy for corporations and governments to instruct their data-mining algorithms to ignore any of our digital fingerprints that carry the #ns hashtag. Crossing the #noshare line would thus provide a basis for social action against the owners of the machines in question. Social pressure is favorable to norm creation. Could #ns eventually become part of a Creative Commons-style license?
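To make this concrete, here is a minimal sketch (in Python, with made-up post text; the function name and sample posts are hypothetical, not part of any real platform’s API) of what such an ignore rule could look like inside a collection pipeline: any post carrying a #noshare or #ns tag is dropped before it ever reaches analysis.

```python
import re

# Match #noshare or #ns as a whole hashtag, case-insensitively.
# The word boundary (\b) ensures that tags like "#nsfw" are not
# mistaken for "#ns".
NOSHARE = re.compile(r"#(noshare|ns)\b", re.IGNORECASE)

def respect_noshare(posts):
    """Return only the posts that do NOT carry a #noshare/#ns tag."""
    return [p for p in posts if not NOSHARE.search(p)]

posts = [
    "Bridge on 5th Ave is flooded, avoid the area",
    "Just thinking out loud here #noshare",
    "Power is back on in our neighborhood #ns",
]
print(respect_noshare(posts))
# → ['Bridge on 5th Ave is flooded, avoid the area']
```

The point of the sketch is how little it would take: a single filter at the front of a pipeline is enough to honor the norm, which is exactly why non-compliance would be a deliberate human choice rather than a technical oversight.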
Obviously, tagging content with #ns does not mean that the content should not be made public. Content tagged with #ns is meant to be public, but for the human public only, not for computers to store and analyze. The point is simple: we want the option of being our public digital selves without being mined, sensed and analyzed by machines without our knowledge and consent. In sum, #noshare is an awareness-raising initiative that seeks to educate the public about our increasingly sensed environment. Indeed, Big Data = Big Sensing.
We suggest that #ns may return a sense of moral control to individuals, a sense of trust and local agency. These are important elements for social capital and resilience, for ethical, community-driven resilience. If this norm gains traction, we may be able to code it into social media platforms. To be clear, sensing is not bad in itself; sensing of social media during disasters can save lives. But whether or not to be sensed should be the decision of the individual.
My PopRock Fellows and I are looking for feedback on this proposal. We’re aware of some of the pitfalls, but are we missing anything? Are there ways to strengthen this campaign? Please let us know in the comments section below. Thank you!
Acknowledgements: Many thanks to PopRock Fellows Gustavo, Amy, Kate, Claudia and Jer for their valuable feedback on earlier versions of this post.
Good idea – but assuming social media adopts this strategy, why then not implement an opt-in instead of an opt-out-mechanism – in other words: make the #noshare the default and add a #share when content is considered public?
Thanks Matthias, yes agreed, perhaps we could start with opt-in to make the case and then implement an opt-out policy.
Matthias – I think the idea is to put up a bit of a moral roadblock in the way of using data tagged with #noshare.
If it’s opt-in – i.e. people have to tag #share – I suspect it’ll be far too easy for people using feeds to say ‘they meant to share – they just don’t know the hashtag’!
Whereas using #noshare makes the line pretty clear – use these tweets/messages/images and you’re going against the express intent of the person from whom the data came.
Obviously the best thing would be for Twitter, etc. to implement a change on their back-end that would mean that #ns tweets weren’t available through their API. But I feel this is a few steps down the road.
The only problem with opt-in is depending on any and all companies/computers (read: governments) to embed it in their code. I’m not so sure that, in today’s world, I’d trust it would be done.
While I agree that enforceability is a potential issue (especially considering how far out of the privacy barn the Twitter data scraping horse has run), there’s precedent here. robots.txt is pretty widely honored, despite being nothing but a norm that’s been agreed upon and codified in most major search engines. It’s a thought-provoking idea.
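For readers unfamiliar with the precedent: robots.txt is just a plain-text file at the root of a website, and compliant crawlers honor it voluntarily, much as #noshare asks. A minimal example (paths here are illustrative):

```text
# robots.txt – a request, not an enforcement mechanism
User-agent: *
Disallow: /private/
```

Nothing technically stops a crawler from fetching /private/ anyway; the file works only because ignoring it invites public disapproval, which is exactly the dynamic #noshare hopes to create.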
Very interesting, thanks for sharing, Matt!
This is similar to that thing on webpages that prevents search engines from including the page in search results (no-robots? I forget what it’s called). The thing is, you would have to get Twitter to agree to this; they own the platform, unlike HTML, which no one owned (though it’s a standard that everyone, browser makers and search engines included, had to agree to). Any developments on that front? After that, what’s left for us to do is a massive informational campaign; maybe get a celebrity or a TV/news network to use it or showcase it. #thinkingoutloud
What about sensing that leads to a greater good? Alerting people in the path of a storm, or who would be impacted by a certain law or who have kids in the school district…
Thanks for reading and commenting, Kim. Yes, we’ve been thinking along similar lines; the notion of “Data Philanthropy” at the individual level for disaster response purposes. Perhaps something along these lines: