In a single weekend I had a trio of experiences that were shocking in just how normal "future" technology has become.
The first was when I was having lunch with my wife and a friend of hers from out of town. This friend, an admitted 'Mommy Blogger', mentioned, unprompted, just how much she missed Google Reader. After Google discontinued it earlier this year, she had yet to find a suitable replacement and, as a result, felt like she wasn't as up to date with what her peers were up to. The common narrative around Google Reader users was that they were a vocal, yet small, phalanx of power users; Google had sunsetted the popular blog aggregation tool because it wasn't growing. My wife's friend is an intelligent person currently launching her own micro-business. But is she a TechCrunch-loving, hard-core web junkie, as we were led to believe the typical Google Reader user was? Hardly.
The next day I woke and made my way to the kitchen. Over breakfast my wife and her mother were discussing what a hashtag was and its correct usage on various social media platforms. Most online denizens know Chris Messina launched that boat back in 2007. However, to hear the discussion was surreal.
"Oh, I thought she was using it to show a break in the text, to put extra emphasis on each word."
"No, it means you're referring to a larger conversation that others are having. Most apps will automatically link the hashtagged word to search results for that item. It's a way of connecting what is currently being said with what others are also saying."
The final moment of future shock came while coordinating carpools to run errands. It wasn't just the burst of address swapping whenever one was mentioned. It was the spontaneous discussion of how badly Siri sucked for navigation and the hacks people had come up with to circumvent its potentially spurious turn-by-turn 'help'.
I have been blessed with a preponderance of smart people in my life. Being the bright people that they are, it shouldn't be a surprise that they're not only using the software I'm contributing to, but actively changing it to fit their own ends. To think of them as only "consumers" implies a dangerous, hierarchical relationship. It divides the world into two distinct groups: the silly users, who either "get" things or don't, and the 'technorati' who sit in judgment.
The term “user” made its appearance in computing at the dawn of shared terminals, when multiple people shared time slices of one computing resource. It was solidified in hacker culture as a person who wasn't technical or creative, someone who just used resources and wasn't able to make or produce anything (often called a “luser”). And finally, it was made concrete by Internet companies whose business models depended on two discrete classes of user: a paying customer (often purchasing ads) and a non-paying consumer (subsidized by viewing those ads). Along the way only a few criticized the term, calling it abstract at best and derogatory at worst.
It's time for our industry and discipline to reconsider the word “user.” We speak about “user-centric design”, “user benefit”, “user experience”, “active users”, and even “usernames.” While the intent is to consider people first, the result is a massive abstraction away from the real problems people feel on a daily basis — away from simply building something you would love to see in the world, in the hope that others desire the same.
Software does not exist in a vacuum. It is operated by people with decades of varied and unique experiences, which produces outcomes never considered in our insular VentureBeat or The Verge echo chambers.
We all live in the future now. Being in touch with a larger community is humbling; we are no longer descending from on high, decreeing new realities (if we ever were). As writers of software, we need to approach life with eyes and ears open. Software shouldn't be a statement about us, the developers. It should be more about what it allows people to say about themselves.