Where Did The Future Go?

When I was a child, my optimism was boundless. Infinite wonder surrounded me, and I saw it as exactly that. Life was for exploration! And I was lucky. I grew up as the world built and developed the greatest territory ever imagined: the internet.

Then I grew older.

I never lost the sense of wonder. I count myself lucky for that. But all the complicated entanglements became obvious to the point of near-preeminence. "Here be dragons" was no longer a beacon to run to. It was a serious warning.

I put away childish thoughts and reasons.

At least, that's what I deluded myself into believing. The truth is something far worse happened: the maps got filled in and I mistook them for the complete territory. They are not. They never were. They mustn’t ever be. Thinking otherwise is a dangerous intellectual trap that often accompanies familiarity and expertise. The true Siren's call that smashes your ship to bits on unforgiving rocks. History may rhyme, but it need not repeat. Not only is the map not the territory, but there is no fixed territory. That's what it means to be human: we have the near-magical ability to imagine and create new ones.

That's also what the internet was to me as a child. It wasn't a place — it was a substrate. It was an æther for human creativity. Promethean Fire channeled by "the electron and the switch." An everywhere and No Where I entered in order to explore and discover ideas, people, and possibilities — and to realize them!

The internet is still that place for me. But the future I imagined would have arrived by now never did. Don't get me wrong, people built some truly incredible things. My workstation really is a bicycle for my mind; my iPhone truly does make my life better; and, whenever Twitter goes down — when we suddenly find ourselves unable to think the same thought at each other across the world all at once over a damn near philotic web — I get reminded that the social world is unfathomably more accessible than it ever was before. And yet, I think it is all a stunted thing in comparison to what I thought it would be.

This has been a lot of abstraction and aphorism, so let me try to be more concrete and specific: I think the great force of the future's diversion was cloud computing.

Cloud computing is awesome, in both senses of the word. By invoking the correct spell, I can marshal and command computational resources well beyond those I imagined even a decade ago — and I can do so for surprisingly little expense. Jeff Bezos was right: the internet is like electricity, and infrastructure as a service means you can do more with it.

But there is an expense paid that's often obscured because it's not currently measured in dollars. Cloud computing — especially when coupled with ad-supported business models — selects for particular data flow patterns; and, in turn, those data flow patterns select for particular computations. So developers don’t spend their time building smart and powerful agents to assist, extend, and act on behalf of the people they serve; developers build extraction engines: mechanized means for filtering signal sampled from a tiny portion of the spectrum of our possibilities, narrowed by historically-imposed constraints. It's not a malevolent process. It's the evolutionary outcome born of a series of realized happenings.

But it need not be that way.

Let me give you an example to consider. I am what you would call…an active twitter user. I use it like many other extremely online twitter users. But I also use it differently from them. I have the technical skill to create different views than the one twitter restricts me to — the one their architecture and business model demands.

  • I use a graph database to see how people relate to each other. It adds another dimension to the experience, often an illuminating one. When you can condition expressions on graph structure, you can see the contrasts between groups of people. You can see with a little more clarity what people believe because believing it marks them as an identifiable group versus what they believe because it's directly coupled to their experience and knowledge. (A rough sketch of the idea follows this list.)

  • I use a full-text search engine over previous tweets to look at the progression of someone's expressions. It adds another dimension to the experience, often an illuminating one. In isolation, a particular tweet may resonate, but it doesn't help you judge who they are or estimate what they believe. N of one is the lowest of bounds. And social media means — algorithmically curated or not — your sample is intrinsically biased. The things you're most likely to observe? Those you're most likely to agree with strongly or reject vehemently. If you want perspective, you need the ability to zoom out, an affordance not provided by the offered client. (Again, a sketch follows below.)
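
To make the first bullet less abstract, here's a toy sketch of what conditioning on graph structure can look like. It uses networkx in place of a real graph database, and every account name is invented; my actual setup is different, but the shape of the query is the point.

```python
import networkx as nx

# Hypothetical follow graph: an edge (a, b) means "a follows b".
G = nx.DiGraph()
G.add_edges_from([
    ("alice", "ml_lab"), ("alice", "indie_dev"),
    ("bob",   "ml_lab"), ("bob",   "quant_fund"),
    ("carol", "indie_dev"), ("carol", "zine_press"),
])

# Two groups of accounts, however you clustered them.
group_a = {"alice", "bob"}
group_b = {"carol"}

def followed_by(group):
    """Accounts that anyone in `group` follows."""
    return {target for member in group for target in G.successors(member)}

# What both groups attend to vs. what marks group A as a group.
print("shared:", followed_by(group_a) & followed_by(group_b))
print("group A only:", followed_by(group_a) - followed_by(group_b))
```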
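
And for the second bullet, a minimal sketch of the zoom-out search, using SQLite's FTS5 (assuming your SQLite build ships with it) and invented tweets. The engine doesn't matter; what matters is ordering every match chronologically so you see the progression rather than the single resonant tweet.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE tweets USING fts5(author, created_at, body)")
conn.executemany(
    "INSERT INTO tweets VALUES (?, ?, ?)",
    [
        ("some_user", "2018-03-01", "decentralization is a distraction"),
        ("some_user", "2019-07-15", "starting to think decentralization matters"),
        ("some_user", "2020-11-02", "decentralization is the whole point"),
    ],
)

# Everything this author said about a topic, oldest first.
for created_at, body in conn.execute(
    "SELECT created_at, body FROM tweets WHERE tweets MATCH ? ORDER BY created_at",
    ("decentralization",),
):
    print(created_at, body)
```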

I spent the better part of a year trying to figure out how to offer these services broadly to people. My conclusion was that I can't. It's a fool’s errand. Twitter actively doesn't want you writing clients and I don't want to build a sandcastle on someone's private beach. Best case scenario? They don't knock it down for a while. But even more critically, I don't think anyone could make a twitter that works the way I and others may want using an ad-supported business model or an efficient data flow architecture, let alone both. That's the evolutionary dead-end we find ourselves in through the lens of one small but (hopefully) legible example.

So yes, it is time to build. It's always time to build! And, I know what I'm building now.[2] The internet is a decentralized and distributed network of people and resources. I think it's the greatest thing we've ever built. But, ironically — and, tragically, too — we failed to maintain peer-to-peer as the primary conduit. This matters because how we interact selects for what we perceive. More than that, computation itself is a medium. Hell, it’s THE medium. As more people learn to code and as no-code becomes more powerful, how we interact also selects for what we can achieve. As that tide continues to rise, I want to make sure it lifts us all. I want to help push us out of this local minimum.

The future never arrived because we failed to distribute it correctly.


  1. Photo by James Copeland / Alamy Stock Photo. It’s actually pretty shitty that a platform for creative people still has no clean means of attribution for use of a different type of creative person’s work.

  2. I don’t mean this metaphorically. This whole essay is me waving a flag. I applied to YCombinator hoping to build a business motivated by these exact ideas. If I get in, I’m going to build it; if I don’t, I’m going to build it. In both cases, I need help. Give me infinite time, I could do it myself. But I don’t have infinite time, and I don’t think we do, either. If you’re a distributed systems engineer, cryptographer, network engineer, virtualization engineer, or product developer, please reach out to me on twitter or via email at jbn@abreka.com. (Note: I’m unlikely to respond until next week because now I have to go back to playing an important albeit much less interesting game.)

Ambient Intimacy is Intimacy


Ambient intimacy is seeing twitter's "What's happening?” prompt and answering, "How was your day?" instead. Given sufficiently personal expressions, what accretes is “real” intimacy — just digitally-transmitted instead of verbally-expressed over a kitchen sink. And even in the case of one-sided communications, granting the emitter your attention gives them access to the same cognitive machinery.

"I've never met them but I know them."

More often than not, you're not wrong.

I've met a lot of people through twitter. Generally, they match my pre-IRL expectations. The ones who don’t tend to use Twitter as if it were LinkedIn but without suit jackets. They dispatch TwitterCard-sealed news from their particular domain, and they share little personal information. The ones I do recognize upon first bio-meeting? They aren’t using twitter as a social bookmarking service — they’re being social through self-expression.

“I never met them but I know them.”

Admittedly, Twitter is a lot of different things to different people. And, it’s often a confusing place because limited context makes it hard to distinguish what mode someone is in when they tweet. But, oddly enough, things get easier with ambient intimacy. When you have a personal model of someone in addition to an intellectual one, you see more of them. That doesn’t mean you won’t be wrong. But it does make the context incomparably richer — and, social.

That’s the real reason I care about anti-abuse generally and community boundary enforcement specifically. Hat-tip to J. Corwin for introducing the term to me: we live in a noosphere.

There are lots of similar words used to say the same thing about our increasingly hyper-connected information environment, but I like this one because, to me, it conveys how new and weird things are. We’re only at the beginning of what’s even possible, let alone what’s necessary.

And, I think social networks designing for ambient intimacy is a necessity. The distinction between “irl” and “online” continues to erode. That richer model of what someone is saying that ambient intimacy affords? It’s error-correction repurposed from the former environment and adapted for the latter one. It helps us correctly receive what someone really means to say. Even if what they’re saying is silly, that’s useful.


Postscript: I’m not working on reflock — what was meant to be my twitter anti-abuse tool — anymore. There are a lot of reasons for that, but it looks like Tracy Chou’s app mirrors a lot of what I was implementing. I haven’t used it so this isn’t an endorsement, but please sign up for the waitlist if you have abuse problems. I hope it works well because I want you to share yourself freely!

The Real Supernode Problem


At the time of writing, Barack Obama has over 110,000,000 Twitter followers.

Barack Obama is a supernode.

[Image caption: A digraph with 199 nodes dependent on 1 central one. I made it so this post would have a media card for embedding, but, by happy accident, it has some pixelated artifacts that resonate with the spirit of this post.]

The Technical Problem

As a service specification, assume:

  1. Subscribers (i.e. followers) should observe an Obama tweet quickly (low latency).

  2. No subscriber has privileged access in that all users share the same expectation of observability (fairness).

Twitter is a world-scale, real-time, distributed (albeit centralized) platform, and its developers spend considerable resources on engineering low-latency bit plumbing. And, in the practical terms of end user experience, the distance in time between action and feedback matters.

But, consider the challenge of materializing Obama’s tweet in each subscriber’s feed:

  • If you manage an on-disk circular queue of tweet IDs for each user timeline and punt the full content to lookup requests on caching infrastructure, you’ll have to fan out a 64-bit snowflake (tweet ID) to 110,000,000 disk or memory (or both) locations. That’s nearly a gigabyte of instantaneous writes.

  • If you were to send the full tweet instantly to every connected, subscribing client, N, and assumed that the size of the delivered tweet was 296 bytes — 280 characters plus two 8-byte integers for the snowflake and user ID — you’re writing (296 × N) bytes of payload alone to your network wire. With a million connected users, that’s about 280 MiB ignoring network overhead.
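
If you want to check those two figures, it's just back-of-the-envelope arithmetic; a quick sketch in Python (no real Twitter internals assumed):

```python
FOLLOWERS = 110_000_000

# Fanning out one 8-byte (64-bit) snowflake ID to every follower's timeline:
fanout_bytes = 8 * FOLLOWERS
print(f"{fanout_bytes / 1e9:.2f} GB of timeline writes")  # ~0.88 GB

# Pushing a ~296-byte payload to 1,000,000 connected clients:
payload_bytes = 296 * 1_000_000
print(f"{payload_bytes / 2**20:.0f} MiB on the wire")     # ~282 MiB
```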

These are extremely naive simplifications. There are much smarter ways of doing things that account for social network dynamics and use reasonable heuristics.[1] But, it should be clear that the technical problem is not trivial, especially once you recall that other users are also active at any given moment.

As for (2) — well, realistically, it’s an intractable specification. Assume Obama tweeted from his Kalorama residence while connected to the nearest data center in Virginia.[2] My friends in Maryland will probably receive his tweet before me. In reality, packets traverse complex routes before arriving at their digital destinations. But, it’s reasonable to simplify by saying whoever is closer as the photon flies gets their 280 characters first.

There are technical means of mitigation (albeit with wild uncertainty and impressive computational expense). If you had a map that associated every subscriber with a distribution of their read latencies conditioned by datacenter as well as one for inter-datacenter latencies, you could use both to randomly select a distribution path for each tweet such that the shared expectation converges on a single value for all subscribers.[3]
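
To make that slightly less abstract, here's a toy version with made-up latencies: give each subscriber a fast and a slow candidate route, then mix between them with whatever probability makes every subscriber's expected arrival time land on the same target.

```python
import random

# Hypothetical mean delivery latencies (ms) per subscriber for two routes.
subscribers = {
    "dc_virginia_user": {"fast": 20.0, "slow": 80.0},
    "dc_oregon_user":   {"fast": 45.0, "slow": 90.0},
    "dc_dublin_user":   {"fast": 70.0, "slow": 120.0},
}

# Everyone must be able to reach the target on average, so it can't be
# lower than the worst subscriber's best route.
target = max(routes["fast"] for routes in subscribers.values())  # 70.0 ms

def pick_route(routes, target):
    """Pick 'fast' or 'slow' so the expected latency equals `target`."""
    fast, slow = routes["fast"], routes["slow"]
    if fast >= target:
        return "fast"
    p_slow = (target - fast) / (slow - fast)
    return "slow" if random.random() < p_slow else "fast"

for name, routes in subscribers.items():
    print(name, "->", pick_route(routes, target))
```

Even this toy only equalizes the expectation, not the variance or the tails, which is part of why footnote [3] calls the whole scheme an absurd simplification.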

Alternatively, you could discard (2) by asserting: sub-second arrival-time heterogeneity and the associated information arbitrage opportunities (and risks) are not the platform’s concern — which, I think, is perfectly reasonable.

Twitter isn’t an enterprise-grade messaging queue. It’s a social networking service.

The Social Issue

Taking a glance at twitter’s engineering blog, you’ll quickly discover that Twitter cares deeply about (1). I have no idea if they care about (2) — I just made it up. It’s reasonable-sounding only if you’re thinking in the wrong context or too many of them. Fairness is a laudable concern and entities like high-frequency traders do seem to systematically exploit technicalities in parasitic ways, but complexity is expensive and the boundary separating problems you can attend to from the fantasy land of those you wish you could is much nearer than you think. If you want an aphorism: pick your battles. If you want a philosophy: do one thing, and do it well.

So, what is Twitter’s one thing?

Simple: Acting in accordance with its fiduciary duty to investors.

This isn’t meant to be a provocative statement. Given that twitter is a corporation, it’s a legally-enshrined obligation. And, as a publicly-traded company, there are even more commitments attached. Of course, twitter does have to satisfy some demand, lest the users go elsewhere. Setting aside manufactured demand, you can simplify twitter as a multi-sided market for attention:

  • advertisers buy it;

  • twitter sells it;

  • users offer it in exchange for that delicious content.

Certainly, there is justifiable cause for concern with adtech. And, as someone who seems to be perpetually aware of squandered attention yet capable of resisting anything but temptation, I have serious issues with attention allocation under this particular business model (as implemented). But in stark terms, I don’t think twitter could have existed in any other form, historically speaking. And, if you follow my ExTrEmElY oNlInE life, it’s obvious that I get a lot out of it.

So, what’s my problem? Well, you see dear reader, I’m a true believer.

I think the reason for twitter’s existence, specifically, and social networking services, generally, should and can be removing the constraints of time and space from human social interactions! To facilitate the discovery of ideas, people, and connections that otherwise might not be possible! To expand the set of what we may perceive!

This may be a romanticized view, but I’m a romantic at heart and this is my newsletter that you’re subscribing to, so I assume I’ll be forgiven any (over-)indulgence here. But, what does this have to do with supernodes? Well, at least for me, they’re one of the clearest examples of engineers pursuing (brilliant) technical solutions to mischaracterized problems in compliance with business model demands while ignoring the social ones. That is, allowing supernodes is a problem by itself because,

SUPERNODES RESTRICT OUR SOCIAL ATTENTION.

This isn’t to say that I believe what Obama has to say doesn’t matter or that it doesn’t warrant my attention. I just don’t think it’s reasonable to computationally-encode and grant anyone that level of write access to our collective attention on a social medium. That’s not a social interaction, it merely simulates one. And in the confusion, all manner of negative consequences manifest.

Admittedly, even with technically-imposed limits — say a 30,000 follower maximum capacity — I’d still be aware of Obama’s tweets. As the 44th President of the United States, what he says is culturally and politically relevant, and it would traverse the social graph quite readily. But the difference between percolation and direct distribution is vast — or, at least, I believe it is. I want a social medium that facilitates social interactions and discovery, rather than one that further entrenches the pathological hierarchy users initially (at least, I hope) meant to escape.

TL;DR?

Fuck supernodes. We don’t need them. Social tools should design against them.[4]


  1. For example, see: Feeding Frenzy: Selectively Materializing Users’ Event Feeds.

  2. I don’t actually know where twitter’s datacenters are located.

  3. This convoluted method is, hilariously, an absurd and still-wrong simplification.

  4. For all I know, Twitter does try to attenuate supernode signal strength. I’d applaud them if they do. But, the thing is: I can’t inspect their code, and unless Blue Sky is successful, it’s still a proprietary protocol that makes delivery opaque. I want agency and transparency, and Twitter limits both in this context. In any case, I absolutely hate footnoting the concluding sentence, but it’s the right thing to do.

May I have your attention, please

Allow me to introduce myself

My computer starts each (west coast) day by automatically tweeting,

  “…a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” (Herbert A. Simon)
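
The job itself is nothing clever. Here's a minimal sketch, assuming tweepy and placeholder credentials rather than my actual script, of how a cron-fired daily tweet can be wired up:

```python
# Roughly the shape of the daily job (cron fires it each PT morning).
# Sketched with tweepy's v2 Client; keys below are placeholders.
import tweepy

QUOTE = (
    '"…a wealth of information creates a poverty of attention…" '
    "- Herbert A. Simon"
)

def tweet_daily_quote():
    client = tweepy.Client(
        consumer_key="YOUR_KEY",
        consumer_secret="YOUR_SECRET",
        access_token="YOUR_TOKEN",
        access_token_secret="YOUR_TOKEN_SECRET",
    )
    client.create_tweet(text=QUOTE)

if __name__ == "__main__":
    tweet_daily_quote()
```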

The quote itself comes from a talk Simon gave in 1969 titled Designing Organizations for an Information-Rich World. Having completed a Ph.D. in Computational Social Science (CSS), I found Simon unavoidable. And, rightly so! But — as someone whose attention and interest seems to experience an almost infinite regress absent careful environmental shaping — this quote became a mixture of mantra and mission to me. The first time I read it was the first time I seriously asked, "well...are we designing for an information rich world?"

I don’t think we are.

We certainly live in an information rich world. But I think the operationalized design principle is quite different. While there is a(n often realized) danger in hyperbole, I don't think it's hyperbolic to state plainly: attention is both for sale and frequently sold.

This isn't a new phenomenon.

The competition for attention itself is part of the Darwinian struggle. The existence of markets for attention — formal or otherwise — is old. That's not inherently bad. And, I'm not sure how much the proportion of our attention actively allocated by markets has changed over time. But, I also think this line of reasoning misses a critical point. If "software is eating the world," why can't it also better manage my attention?

I don't have an answer yet.

My knee-jerk reaction is that it's easier to sell freely-given attention than to demand money for a service that allocates it in a user-controlled way. In the context of social technologies, this is especially true given the logic of Metcalfe's law. Better design principles may exist, but evolutionary pressures on corporate entities select for the ones that forge exponential moats the fastest. Alternatively, maybe we do have great tools for attention allocation; it's just that "work expands so as to fill the time available for its completion." In either case, I want to ask: Is this the best we can do?

I don't think so.

I'd like to find out.

Knowledge discovery and curation is inherently a social activity. Even if you could resign yourself to a (boring) life of quiet isolation, you wouldn't want to! From "don't eat those red berries," to "check out this fascinating research paper," we is greater than the sum of the I parts. But, there is room for computer-mediated improvements, and I intend to contribute to that effort.

Perhaps this means I'm building and hacking on niche products. Maybe my struggle to remain focused really does reflect some inherent deficit which demands specialized design. But, I doubt it. "Information overload" is a problem so common that it requires no elaboration for clear articulation. If you feel similarly afflicted, subscribe to this newsletter, where I'll share what I find along the way.
