J611 Week 8 Journal

The overwhelming sense I got from this week’s readings was a rising tide. Humans never got a chance to figure out how to handle one another online and in digital spaces. Now we’re getting AI that can produce in a day everything every human online has ever created.

This isn’t far from what we learned last week about beauty companies’ responsibility when marketing abroad and Vietnam’s responsibility while rapidly expanding its high-tech employment and production. To wit: where’s the responsibility? Where’s the brake? Oversight is never going to happen, so how are we going to teach responsibility?

Who is going to take responsibility for any of this?

Modern Communication: Now With Even More Noise

The internet is a marvelous thing: for the first time in human history, everyone is standing in the same room and can speak with just about anyone at just about any time. Unfortunately, we’re all yelling at the tops of our lungs simultaneously, and that makes hearing anyone damn near impossible. It looks like AI writers are making that even worse.

In radio theory there’s signal, the actual message-carrying wave, and there’s noise, the interference and static that garble the broadcast. When the receiver demodulates what it picks up, it has to separate the two, and sometimes that’s genuinely hard to do. That’s where choppy, fading or unclear audio comes from.
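To make the signal-to-noise idea concrete, here’s a minimal Python sketch (my own toy example, not something from the readings) that buries a clean sine-wave “broadcast” in random static and reports the signal-to-noise ratio. When that ratio goes negative on the decibel scale, there is literally more static than message.

```python
# Toy signal-vs-noise demo (illustrative only, not a real demodulator).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)

signal = np.sin(2 * np.pi * 5 * t)          # the message-carrying wave
noise = rng.normal(scale=1.5, size=t.size)  # interference / static
received = signal + noise                   # what the receiver actually hears

# Signal-to-noise ratio in decibels; negative means the static dominates.
snr_db = 10 * np.log10(signal.var() / noise.var())
print(f"SNR of the received broadcast: {snr_db:.1f} dB")
```

The analogy to our feeds is loose, of course, but the point carries: the noisier the channel, the harder the worthwhile signal is to pick out.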

In modern communication the principle’s the same. The signal is the stuff worth knowing. The noise is so-called “fake news,” crap advertisements, pop-ups, spam, poorly written emails, Facebook posts from bot accounts, Facebook posts from dumb friends — you get the idea. And now we’ve got news-writing bots and algorithms learning what’s already prevalent on the Internet and simply duplicating it.

Ironically, maybe the solution comes from different bots.

For Good or Ill: Bots Might Know Best

A good journalist challenges a source when they suspect duplicity. When gathering data about anything of note, a good journalist tries to detect falsehoods and accidental inaccuracies. Before running the story, the journalist should fact-check everything they’ve got and make sure nothing fell through the cracks. The worst news outlets and interviewers are the ones who don’t push back on their subjects and take everything they hear, and everything they think they know, at face value.

Maybe robots are the way forward, then. They don’t take anything for granted (beyond what they’re programmed to). When a robot can “filter out 99 percent of false news stories,” maybe there’s hope.

[A researcher] cites the earthquakes that struck Kumamoto in southwestern Japan in April 2016. Soon after, an image circulated on social media of a lion that had reportedly escaped from a local zoo and was roaming the city. But [machine-learning AI news gathering service] Fast Alert realized the image originated in South Africa.

This example struck me particularly because I spend a lot of time on Reddit, where reposts are frequent. The idea of a bot doing the job of Know Your Meme in a heartbeat is exciting and heartening, and it could reduce the number of crap posts we see on Facebook with clearly re-used footage presented as if it happened the other day.
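For the curious, here’s a rough sketch of one way a bot could flag a re-used image: a perceptual “average hash” fingerprint that survives resizing and recompression, compared by Hamming distance. The file names are hypothetical, and a service like Fast Alert presumably uses far more sophisticated models; this is just the simplest version of the idea.

```python
# Minimal "average hash" repost detector (a toy sketch, not Fast Alert's method).
import numpy as np
from PIL import Image

def average_hash(path: str, size: int = 8) -> np.ndarray:
    """Shrink to an 8x8 grayscale thumbnail and threshold each pixel against the mean."""
    pixels = np.asarray(Image.open(path).convert("L").resize((size, size)), dtype=float)
    return (pixels > pixels.mean()).flatten()

def looks_like_repost(path_a: str, path_b: str, max_distance: int = 5) -> bool:
    """A small Hamming distance between fingerprints suggests the same underlying image."""
    distance = int(np.sum(average_hash(path_a) != average_hash(path_b)))
    return distance <= max_distance

# Hypothetical usage: compare the viral "escaped lion" photo to an older archive image.
# print(looks_like_repost("viral_post.jpg", "archive_photo.jpg"))
```

Even this crude fingerprint would likely catch straight reposts that have only been resized or recompressed; catching cropped or re-edited footage is the genuinely hard part.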

But that means it’s a race to the top — or bottom depending on your perspective.

The Arms Race Begins: Deepfake vs Deepfake Detectors

There was a time when a rock was really effective at killing someone. So people wised up: look out for people carrying rocks.

Then throwing them started working. So people wised up: stay far enough away.

Then people made spears. So people made shields.

Then people made swords. So people made armor.

Then people got on top of horses and did the stuff you see at Medieval Times. So people made big stone houses and walls.

You get the idea. It all progresses to telephone-pole-sized tungsten rods dropped from orbit, hitting with the force of a massive conventional bomb (“Rods from God” — I’m not joking). My point is that humans are good at spiraling out of control when the thing they’re building gets countered by a one-up technology.
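(As a sanity check on my own aside: a back-of-envelope calculation, using commonly cited and admittedly assumed parameters for the concept, puts one rod’s kinetic energy at roughly ten tons of TNT, a devastating conventional blast but a tiny fraction of the roughly 15 kilotons of the Hiroshima bomb.)

```python
# Back-of-envelope kinetic energy of a "Rods from God" tungsten rod.
# All parameters are assumptions drawn from commonly cited descriptions of the concept.
import math

DENSITY_TUNGSTEN = 19_300   # kg/m^3
LENGTH = 6.1                # m  (roughly telephone-pole length)
RADIUS = 0.15               # m  (about 30 cm diameter)
IMPACT_SPEED = 3_000        # m/s (a typical reentry-impact estimate)
TNT_TON = 4.184e9           # joules per ton of TNT

mass = DENSITY_TUNGSTEN * math.pi * RADIUS**2 * LENGTH      # ~8,300 kg
kinetic_energy = 0.5 * mass * IMPACT_SPEED**2               # ~3.7e10 J

print(f"Mass: {mass:,.0f} kg")
print(f"Yield: ~{kinetic_energy / TNT_TON:.0f} tons of TNT equivalent")
```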

It looks like we’re headed that way with deepfake videos.

Researchers publishing their methods for detecting deepfakes raises a challenging question: how secretive should the process of chasing down deepfakes be? National-security intelligence gathering is secretive not because the agencies are doing shady stuff (mostly), but because once a person knows how you know what you know, they stop doing the thing that let you find out.

If I tell everyone I know a secret of yours because I listened in while you called your mom from your back patio and I was hiding in your trash cans, you’ll stop calling her from your back patio and move your trash cans.

Similarly, how should we tackle deepfakes? Do we fund a team to clandestinely surveil the internet for such videos and grant them the authority to take them down? Or do we insist they regularly publish the research and technology they’re using to identify such videos, thus granting the ne’er-do-wells greater ability next time?

But Don’t Worry: Buy Your Cares Away

Meanwhile, the implications of AI, machine learning, megalithic corporations with detailed accounts of every decision you’ve ever made online, and a globally interconnected data infrastructure and economy are being put to good use: figuring out how to sell you more stuff.

Humans are predictable beasts, and no one knows that better than companies that sell things online. Voice interfaces and voice recognition are an obvious next step for selling us things and getting us further plugged into our Internet of Things. The more we speak with, anthropomorphize and associate with our objects, the more in tune we’ll feel with them. And the more likely we’ll be to trust their purchasing recommendations — but also the less likely we’ll be to get rid of the devices themselves.

Humans are social creatures, and language is very important to us. If we can speak with our technology, we’ll feel even more connected to it than we already do. And speaking is easy and thoughtless, which is exactly where every major seller of goods wants to be: omnipresent and unobtrusive. You know Amazon is desperate to reach the point where we can buy something simply by musing aloud about how nice it would be to have it.


Some people talk about oversight, and some people talk about limits. But let’s be realistic: the genie’s out of the bottle. Responsibility is what we need, and no one’s talking about that. The future will be about coping with these changes and adapting to them, not about living happily with the wise and responsible choices we made today.

J611 Week 7 Journal

We Haven’t Arrived at the End of History

It’s easy to forget that right now only matters because we’re the ones noticing it. Relative to literally everything else, it’s not actually that important.

Believe it or not, we’re not at the start of history or the end of it — we’re not even the most important moment in it. This is just a brief moment, and it’ll pass.

It’s easy to forget that. Not least when we’re reading the increasingly shrill and strident warnings that are sounding all around us. And that’s not to say the klaxons shouldn’t be sounding! We have real problems, and we need to fix them. But we’ll come through this period of time. And once we do, we’ll still have a world to deal with on the other side. And that world is going to be shaped by our decisions on this side.

Facebook and Twitter are the Platforms of the World

I don’t like Alex Jones. In fact, I think it’s safe to say I hate him — an odious emotion if there ever was one. He’s a deceitful, harmful, profiteering, unethical hack and no one should listen to him — ever.

That said, I really don’t know how I feel about setting a precedent of muting him. Together, Facebook, Twitter and Google reach billions — which, if you think about it on a global, historic scale, is absolutely bonkers. They control access to information for probably most of the globe, and now they’re going to become its gatekeepers too.

Do we feel good about them telling us who can and can’t participate on that stage?

I mean, I guess they already were, but there was an effort at objectivity driven by algorithms before. Now it’s as if they’re saying, “We’re a profit-driven corporation with no public representation or civic oversight, and we’re going to control who may and who may not speak to the world.”

That’s not great.

Haters Gonna Hate

Not only am I wary of a handful of corporations deciding who I will and will not be able to listen to, but Alex Jones and his followers will just go somewhere else. How are we going to engage with awful, hate-spewing liars? Do we just repress them and send them underground? Do we want the dark corners of the Internet to fill up more and more so that our brightly lit sections appear cleaner?

Oh I Know: The Government Should Fix It

Maybe India has the answer. Could it be wisest to have the government issue official guidance on what is and is not “fake”? When has it ever been problematic for a government to instruct its people on what they may and may not know?

The death toll among journalists in India is horrifying. And it’s not particularly reassuring to read that corrupt, weak democracies are the worst places to be a journalist. Where are they needed most?

What are Journalists Going to Do?

Meanwhile, journalists and newspapers are struggling with modernity and trying to structure themselves to be resilient in the face of changing economics. What is trust when media is involved? Do self-reported stats about people’s lack of trust in news sources matter when it’s clear they’re also consuming them?

Weeding out the causes of the crisis of confidence in news media is crucial. If survey respondents say that transparency is a key issue for them, does that necessarily mean being transparent will win their trust? (That’s a rhetorical question: the answer is no.)

I don’t have a great transition, but take three and a half minutes and watch this clip from the movie The American President. If you haven’t seen it, you’re going to miss some context, but the bottom line is that the President (Michael Douglas) is in dire political straits. The real reason for watching is the final exchange.

J611 Week 6 Journal

The costs of full digitization are myriad and legion. Some of the barriers to getting everyone in the world full internet access are macroscopic, matters of economics and scale. Others are closer and more personal, like when companies fail to embrace their social and global responsibility.

This week had articles from a gamut of publications that each touched on the cost of the internet. “The digital divide” used to be an abstract idea about the people who knew and understood the social structure of the internet versus those who didn’t. Now it’s taken on a new meaning: the people, cultures and countries with access to high-speed internet are more likely to have higher education rates and better economies than those without.

But that’s also not to say that internet is a panacea. The Arab Spring failed, and Donald Trump regularly manipulates public opinion and understanding of fundamental truths from his position in the White House. In certain times and places, the Internet went from a democratizing force to a mob enabler.

The slippery, comfortable, invisible slope that got us here

Above all the others, my favorite piece this week was David Auerbach’s ‘How Facebook Has Flattened Human Communication’. Succinctly, Auerbach illustrates key principles in data management, big data, capitalism, human nature and the nuance of language.

I can’t think of where I’ve seen others writing or talking about it, but Auerbach’s ideas don’t feel never-before-heard; they’re a part of the zeitgeist. Everyone knows that this is exactly what the “like” button was for, and on the heels of Cambridge Analytica, people knew that the 4 Horsemen of the Datapocalypse were harvesting our data and doing everything they could to keep us online. But no one was putting it down in so few words, with such a clear line drawn for us to see.

For those of us who have spent time in the data world, it’s obvious that structured, quantifiable and explicit data is inherently more valuable and usable for a machine. And for those in the online business world, it’s obvious that keeping people’s eyeballs on your product as much as possible is first, last and everything in-between. And for those of us in the social psychology world, it’s obvious that people are driven by competing urges of complexity and simplicity.
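A toy contrast of my own (not something from Auerbach’s piece) makes the machine-usability point: structured reactions can be tallied in one line, while free-text posts still need a person or a model to interpret them before they count as data.

```python
# My own toy illustration of why platforms prefer structured, explicit signals.
from collections import Counter

reactions = ["like", "love", "like", "angry", "like"]    # explicit and quantifiable
posts = ["ugh, Mondays", "so proud of my kid today!!"]   # ambiguous, unstructured text

print(Counter(reactions))   # Counter({'like': 3, 'love': 1, 'angry': 1})
print(f"{len(posts)} free-text posts that still need interpretation before they count as data")
```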

But it took Auerbach to put it all together for us — well, me at least. There’s a clear line that runs through the competing-yet-cooperative impulses that drive people to use social media. Forces like nostalgia, loneliness, cynicism, narcissism and vanity drive us all to use social media and to hate it too (a la this piece explaining why “If Facebook demonstrates that everyone is boring and Twitter proves that everyone is awful, Instagram makes you worry that everyone is perfect – except you.”). But those same forces are also universal experiences that Facebook’s machine brains need to be told about — and we do tell them.

I think Auerbach’s piece is brilliant, and telling. Every sentence deserves a chapter, and I’ll be buying his book.

Also: Facebook and VR

My boy Ben Thompson at stratechery.com wrote a piece on Facebook’s dive into VR. The punchline is simply that Facebook really, really wants to be a platform, not a service where ads get sold. Facebook (and Zuck) wants so badly for people to live their lives through Facebook, not just interact with it while they’re taking a crap or waiting for the bus.

Thompson thinks that’s a misunderstanding (internally) of their own place in the digital hierarchy. He makes a case that Facebook’s purchase of Oculus is representative of their misunderstanding of their role (what a surprise, given their reaction to social engineering and data manipulation charges) and will ultimately be viewed as a mistake.

J611 Week 4 Journal

Let’s Outlaw Advertising and Other Lofty Goals

Ramsi Woodcock at the University of Kentucky suggests that a core problem with television and mass media is that advertising represents a monopolistic hold on competition and effectively strangles the free speech of companies that don’t provide sugary death water to 60% of the planet.

His solution seems to be to lobby the FTC to ban mass advertising, or at least have the FTC sue certain corporations and advertisers on anti-trust grounds. I hope I’ll eat my words some day, but I think it’s safe to say that the man responsible for trolling the Internet for the last two years isn’t going to suddenly decide that advertising isn’t a protected form of speech.

Woodcock’s suggestions might drive the conversation forward, but they’re certainly not very plausible. Maybe by getting enough people agitated about the effects of massive corporations sucking all the air out of the competition-room we can start meaningful conversations about reform, but I strongly suspect this ship has sailed.

“We Could, So We Did,” said every massive corporation fooling around with our Personal Information

If Woodcock’s piece was a sobering realization of our impotence in the face of massive corporations’ interests, then Lucia Moses’ piece about USA Today, ESPN and NYT targeting ads to users based on their mood as determined by artificial intelligence was a swift kick in the nethers.

Reading ESPN’s Senior VP Vikram Somaya say, “We have a lot of first-party data we can use with a lot of transactions with the consumer; why wouldn’t we do it?” really inspires confidence in a world recently stunned by Cambridge Analytica leaks, regular security breaches and — really, anyone who’s ever watched any episode of Black Mirror.

Our personal information is the most valuable thing average citizens bring to the internet today, and massive online entities are able to acquire it, monetize it and profit from it without our input or say-so. On top of that, they’re using deep learning to identify and respond to our emotions as we use their product — who’s to say they’re not going to try to manipulate our emotions too?

“Publishers and agencies say they draw the line at using personally identifiable data.” Oh, yeah, that’s fine — no oversight needed there!

Jeff Bezos Saves WaPo and Sets the Example

This week’s star baker was Dan Kennedy. His piece exploring The Bezos Effect occasionally runs a bit close to being a paean to the world’s wealthiest human, but on balance it does a great job of investigating and analyzing the successes of acquiring a massive news entity and trying to make it productive in this century.

Aircraft carriers famously take several miles to turn when underway. Having never worked at the WaPo, I can only imagine that it is similarly burdened with internal, external and institutional momentum that makes revolutionizing, modernizing and embracing the fact that the Internet is — in fact — a different thing from television and radio pretty hard.

Reading about Bezos’ philosophies and plans to convert the newspaper into an internet-facing platform was exciting, to say the least. Keeping the editorial and technological heads in place helped maintain internal momentum, while flooding the scene with money tied to profit clearly did something to get the paper up to speed.

Just Whose Responsibility is Personal Data Online?

My boy over at Stratechery didn’t post a new blogpost before the deadline for this submission, so I went back to his post about data factories online.

This blogpost from Ben Thompson wrestles with the idea that online aggregators (Facebook, Google) actually do have a very real interest in keeping users happy, so it’s not entirely true that “you’re the product.” But he points out that Facebook and Google are effectively converting the entire Internet experience into a data-refining operation, one that is profit-driven, which justifies the need for some regulation.

The struggle, however, is that personally targeted advertising is not only going to be nearly impossible to shake off, it’s also beneficial. But that doesn’t mean it doesn’t have the potential to be insidious and harmful. Thompson writes an exhaustive analysis of what exactly Google and Facebook provide for the internet, advertisers and users, while calling for better regulation and oversight. Go read it.

J611 Week 3 Journal

When They Spoke, People Used to Listen.

History before the Internet was about pillars.

  • pillar: there were only a handful of newspapers, and everyone read them.
  • pillar: there were only a handful of TV channels, and everyone watched them.
  • pillar: there was only one Hollywood, and everyone watched its movies.

These pillars are collapsing. The question is what those pillars were holding up. Was the roof “an informed populace capable of productive democracy?” Because that’s certainly what a lot of people in the news-making and media-creating world believe.

But maybe the roof that those pillars were holding up was, “a captive audience led by the nose to the facts and conclusions we want them to have.” Some creators and thinkers are suggesting that the collapse of a singular narrative could be a good thing.

In Defense of A Collective Narrative

Ben Tobias, writing for Reuters:

However, I believe that there are two key reasons to think that television news is still of value to society; it has relatively high levels of trust, and, by (theoretically at least) appealing to a mass, general audience, it helps to define public debate.

For Tobias, television news is a pillar holding up a roof that is made up of citizens with the same talking points. For Tobias, it’s a good thing if the populace can be guided toward the biggest, most significant events of the day. They don’t have to be told what to believe or think, but they can be pointed to what’s most important so that everyone’s on the same page.

At its best, the TV news bulletin offers a contextualisation of current events, with a curated rundown and high-impact storytelling helping the viewer to make sense of the world around them.

[…]

Richard Sambrook, the former Director of BBC News, sees television news as a way for viewers to make sense of the world with the help of professional editors.

[…]

Television news has a role to play in guiding consumers towards relevant and genuinely important information.

An Attack on the Pillars

Rob Wijnberg, meanwhile, would probably disagree. Wijnberg is the founder of De Correspondent, a journalism outlet focused explicitly on not following the news. He wrote critically of the news media’s fixation on what is happening right now.

News is all about sensational, exceptional, negative, and current events.

And those five words capture precisely the problem with news.

He advocates for “reporting the climate, not the weather.” In other words, Wijnberg thinks that the pillar of singular news media was holding up citizens who were incorrectly under the impression that they were well-informed, just because they happened to know the most sensational piece of news happening somewhere in the world.

Not only does he look forward to that pillar coming down, but he thinks that journalists have actually created a negative impact by chasing the news.

Excessive news consumption predisposes journalists to believe that what’s happening in the world right this instant, and what’s the most important story to tell right now, is whatever’s getting a lot of airplay in other media.

It’s almost as though “news” is an ugly word for Wijnberg and De Correspondent.

Keep the Pillars Up, or Tear Them Down?

The public’s fascination with “news” is problematic. Most consumers know they should do better about their news sources in the same way that most know they should eat their vegetables. They’ll say it’s good to be responsible, and not enough people are responsible, and they’ll ask for “slow news” and better understanding.

But dangle a cookie, and they’ll go for it. That same public will want to be updated every three minutes about an unfolding news story about a terrorist bomb going off on the other side of the world with children and Americans in danger.

I think it’s bold for Wijnberg to say that his platform is going to provide nothing but vegetables, no matter how much his audience (and his own journalists!) want cookies. I hope it’s the way of the future.

Only Us Eggheads Wonder About Ethics

Meanwhile, the world is changing rapidly, and it’s hard not to despair that the people making the decisions about what the pillars are, what those pillars hold up, and whether or not we’ll move in an ethical direction are the ones who are thinking least about it.

Reeves Wiedeman, writing for New York Magazine, wrote a pretty harsh takedown of the (lack of) ethics (journalistic or otherwise) at Vice magazine.

Wiedeman lays out how Vice basically conned Intel into funding a content studio which turned out journalism funded by the subjects of its investigation.

Vice, now settled in its new office, demonstrated to brands that rather than simply place ads next to its journalism, they could be a part of it — and that while this arrangement dissolved the traditional boundaries between publishers and editors, the audience might not even care.

Later, when Vice went for more funding, the question of ethics and propriety in journalism wasn’t carefully considered by the people in the driver’s seat. A potential investor was warned not to invest, that it wasn’t a safe investment.

In 2011, … the potential investor had in fact joined a round of investment in Vice… “He told me, ‘You were totally right, but the story is good, and we’re just gonna pass it on to the next guy,’ ”

Vice represents the reality of the facts on the ground: the internet is changing everything, and people are sprinting at top speed to get their hands on it. Those of us with the time and privilege to consider impacts, actions and ethics should remember that many of the real movers and shakers don’t appear to be as thoughtful.

J611 Week 2 Journal

Responsibility.

Do internet companies have a role in mediating conversations online?

At its heart, this question is fundamentally about responsibility. Which is unfortunate. Because “responsibility” sucks. Responsibility has two things going against it. One: way too many syllables to be taken seriously. Two: since it usually feels thrust upon us, everyone hates it.

Maybe the number of syllables isn’t that important. But I just feel like truth (1), justice (2), honor (2) — even integrity (4!) are just a bit easier to embrace because they don’t take up excessive tongue real estate.

What’s really problematic about responsibility is that it brings to mind cleaning your room, doing your homework, and cleaning the cat litter. When someone invokes responsibility, it’s all too easy to imagine the same person about to add something like “it builds character.” But seriously: disliking responsibility is a fundamentally dangerous and – frankly – un-American sentiment, but it’s pervasive and rampant.

[Pokemon Ash GIF]

The State of the Reading

It’s a bit depressing to read articles, think pieces and journals about the role of internet companies in society and the expectations placed on them, partly because the problems before us are huge and feel insurmountable, and partly because the solutions are nebulous.

But the worst thing about conversations on internet companies’ role is that most laypeople seem to be trying to avoid any semblance of the responsibility that average citizens owe to the discussion and the situation. Of all the options, the one I dislike least comes from people thinking about ways to educate and improve media and journalism literacy, because it at least might do something about a line of thinking that goes like:

“Look at the state we’re in!”
“Look how we can’t talk about important things!”
“Look how wrong everyone on the internet is!”
“I’m sure it’s Facebook’s responsibility.”

I have a hard time with the thread of logic that leads people who self-select their news sources, convince themselves of their own moral superiority, use Facebook extensively, and unfollow brash people they disagree with to conclude that it’s someone else’s responsibility that our discourse is in the state it’s in. To wit: we did this to ourselves; we created our own situation. We planted these seeds, and it’s now our job to harvest, or to cull the weeds.

Sparks Fly Between Industry Leaders

I don’t know the solution. But I sure did think this conversation between industry leaders was fascinating. On one panel, you have the CEO of the NYT, the Head of News Partnerships at Facebook, the Editor-in-Chief of HuffPost, a Google News Lab representative, and the Director of Research for Reuters. They either know one another decently well, or they were unusually willing to get personal, because sparks flew!

Lydia Polgreen, Huffington Post EIC:

  • “I sat in a room where Mark Zuckerberg used phrases like “broadly trusted” and “common ground” in a way that, frankly to me, felt downright Orwellian. I don’t want trust to be a popularity contest decided by users on Facebook. The idea that this organization could somehow decide who’s trusted and who’s not.” 

Mark Thompson, NYT CEO:

  • “The idea that Facebook is transparent is ludicrous. It’s ludicrous! Maybe Facebook is going to become a transparent organization. It’s worth remembering Cambridge Analytica, and that Facebook’s first reaction when the Cambridge Analytica story surfaced was to take legal action to attempt to get it suppressed. And the kind of Mickey Mouse-world in which Facebook is in favor of transparency and publishers are not is nonsensical!”
  • “To be clear, what we don’t want is Facebook to set itself up as a judge around what political content is.”

Campbell Brown, Facebook:

  • “What conflates advocacy and journalism is when journalists do advocacy, and that’s happening a lot.”
  • “Yeah. What I hear is, you’re all for transparency from Facebook, except when it applies to you.” 

[Shocked Celebration GIF]

The exchange didn’t give me answers so much as it revealed how complicated the problem is. The content creators want their Truth amplified. The platforms want their users to use the site/app more. When those desires align, things go okay. But what does the platform do when what the users want is bad in the eyes of the content creators? Or what does the platform do when the content being created by the “good actors” smells off-putting to the users?

Someone has to take responsibility for those questions, and too few people are stepping up to take responsibility for themselves and their own actions. But plenty of people are stepping up to suggest ways that other parties ought to be more responsible.

Eat your veggies!

As Sam Gill pointed out in his Medium piece revealing how — paradoxically — many Americans want more regulation for internet companies while using them more and more: “Or, perhaps, people want to eat their vegetables when it comes to staying informed, but acknowledge they’ll devour cookies if offered them instead.”

[Vegetables GIF]

Using Weak Analogies to Make My Point…

I think that, in its simplest form, the difficulty I have placing blame on internet companies — or even wanting them to do anything about it — looks like this:

Electric companies try to:

  • Provide as much electricity as people want to use
  • to as many people as they can,
  • as rapidly as they can,
  • in a way that keeps users requesting more.

Internet companies try to:

  • Provide as much content as people want to see
  • to as many people as they can,
  • as rapidly as they can,
  • in a way that keeps users requesting more.

If you plug in a faulty lamp and receive a shock, is the electric company responsible? If you use the Internet badly, is the internet company responsible?

I don’t think the internet companies are responsible for the state we’re in, and I don’t think regulating or legislating them will get us out of the hole we’ve dug. I don’t have a solution, but I certainly don’t want bright ideas about legal regulation of information getting in the way.

Meanwhile, in other reading…

http://www.niemanlab.org/2018/10/more-research-suggests-that-twitters-fake-news-strategy-is-either-ineffective-or-nonexistent/

“A report released by Knight on Thursday finds that most of the accounts spreading fake news on Twitter during the election are still active today — and that “these top fake and conspiracy news outlets on Twitter are largely stable,” because Twitter has not banned them.”

So that’s heart-warming!

J611 Week 1 Journal

Looking Back, or Last Week’s Lecture(s):

So it turns out the internet’s actually a pretty big deal.

What I took away from Dr. Radcliffe’s presentation last Monday is that “media” has been disrupted. The internet tore into the field, mobile phones are in everyone’s hands (who knew?), and there has been a sea change in who produces, who consumes, and how it’s consumed. But most importantly, we’re all still trying to figure out what those changes are, and what they will mean.

What Even Is Media, Though?

Obviously, human culture has a long tradition of creating and presenting art. And pamphlets and books aren’t entirely new either. Plus, for more than 100 years we’ve had newspapers or something like them, and the last century has seen radio and television boom.

But I don’t think “media” has always had such an outsized role in society. To clarify: I’m NOT implying people weren’t always hungry for news, or that they wouldn’t go to great lengths to be informed. What I am saying is that, compared to today, the range of media available to the average person, and the number of ways of receiving it, were smaller by orders of magnitude.

This is the generation of media. And “media” — turns out — is super hard to define. It’s… a self-perpetuating creation of a vicious virtuous vici-tuous cycle of technology, culture, conversation, politics and generational divide.

Here’s what I’m trying to say. Technology has changed our media landscape, just as media has changed our technology landscape. Dr. Radcliffe’s presentation did a much better job than I’m doing here of capturing the complexity of this moment, and some of its ramifications and measurements.

Meanwhile culture, politics, identity, generational differences and more have converged to make this an intersectionally impossible time to categorize. At least that’s what I was beginning to see at the end of the two professors’ presentations last week.

Theories Upon Theories

Meanwhile, I also enjoyed hearing about the theories behind media studies, the attempts to create a vocabulary, some rulesets and predictions. My undergrad degree is in politics and international relations, and that’s a field that loves its theoretical models for interpreting socio-cultural effects.

For me, the broad strokes of the history of the field were the most accessible and understandable part of the conversation. Just as the technology and medium of media have evolved over time, so has the conversation around it — and, maybe most importantly for theorists, the conversation about the conversation.

I thought the main difference between the two presentations was slightly longer-term analysis and consideration on the one hand, and, with Dr. Radcliffe, immediate observational conclusions on the other. Speaking in terms of personal preference, I think professional master’s students will benefit more from the latter approach: contemporary analysis and consideration of modern trends. I like being anchored in theory, but I don’t relish diving into it.

Looking Ahead, or This Week’s Readings:

A Great Week of Reading for Children of the 90s

I’m 31 years old. That puts me at just the right age to remember the excitement of my high school finally getting 56k internet. I usually find it difficult to wrap that experience up in words, but there was a giddy joy that my fellow sci-fi nerds and I felt imagining what the future would be like.

Many of this week’s readings confirmed what I’ve sort of always known: if we’re all waiting for flying cars before we say we’re in the future, we should probably stop waiting. The way media is created and consumed, distributed and monitored is amazing on its own. The articles about Golden Ages of Television are a great example of this – it’s a multi-faceted moment to try to explain, but it’s all changed.

I was older than 13 when I got my first cell phone, and I never committed to a truly “smart” phone until I was nearly 23. But I remember what it was that convinced me: a friend showed me how his smartphone was able to use the phone’s gyroscopes and camera to track the position of celestial bodies and show and identify constellations, planets and stars in real-time — as you looked at them.

It’s cliche — even trite — to say “the future is now,” but I find it helpful to frequently stop, take stock and remember how many previous advancements were making us say, “the future is right around the corner.” There was so much in the 90s and early 00s that made us say, “Once we can do [blank] with our technology, then the future will be here.” And this week’s reading helped me to again take stock and say, “maybe we made it… maybe [blank] is finally possible.”

News Media’s Fixation on “Objectivity” is a Flash in the Pan

For me, the most impactful thing about The Economist’s reporting was the two articles describing the very, very long history humans have with not only highly dispersed “news reporting” but also extremely biased sources of media. It’s a very, very recent trend to think that where you get your information from isn’t potentially a “tainted source.”

For the most part, you were getting your news from a drunk guy at the local tavern, who heard it from a minstrel who got in from a two-week trip from the nearest city hub, who was maybe paid by the local feudal lord to spread a war fervor.

To that end, I’ve always felt — and have felt growing — a sense that the “golden age of pure journalism” is a misconception. The idea that there is, was, or always has been a group, class or profession of people who pass along information objectively, with a pure underlying sense of fairness is laughable at best.

The control, creation and spreading of information has always been driven by political and/or personal gain. America’s experience with monolithic news organizations that claim to commit themselves to truth, justice and the American way is a flash in the pan. But it’s also not exactly true, anyway.

No one knows anything, we’re all in it together

I loved Benedict Evans’ presentation on technology trends (well worth watching or listening to rather than reading). It was the only thing we saw this week that put any amount of order to the chaos that most other analysts and writers were describing. We’re in the middle of a total shift, and few people have a sense of what it means, or where it’s going.

TV has changed, radio has changed, movies have changed (more than once), commerce has changed (over and over again), production models change, finding creative ways to extract money from the masses is always ongoing. One might go so far as to say that the only constant in it all is change.

The common theme through so many of the articles was also how much everyone is struggling to make money. No company wants to just pack up and go home, and so the struggle to define what works, what doesn’t, and how we can make money is very, very real.

The Latest, or my Favorite Readings this Week:

Over at Stratechery.com, Ben Thompson talks about tech trends and analyzes new media events. His big piece this week was about the resignations of Instagram’s co-founders, and what he suspects was behind them.

The entire article is worth a read if you’re interested in the money-making side of tech startups, particularly product-driven companies like Instagram. Throughout the blog post/article, Thompson explains the challenge of making money when all your company actually does is support an app that shares photos.

He compares Facebook’s timeline of building a broad user base and then selling ads with the paths taken by other online social apps.

Thompson also covers Instagram’s choice to launch “Instagram Stories” as a flagrant response to Snapchat, as well as its relative success.

The TL;DR version? His last paragraph:

This is the context for whatever dispute drove Systrom and Krieger’s resignation: not only do they not actually control their own company (because they don’t control monetization), they also aren’t essential to solving the biggest issue facing their product.

I really like Thompson’s writing. He’s clear, direct, understandable and just this side of casual while still obviously knowing his stuff.