Categories
Technology, The Internet

What makes RSS better than social timelines?

Replied to The Fail Whale Cascade by Luke Harris (lkhrs.com)

I’m bored of what I call “the timeline era”. Scanning an unending stream of disconnected posts for topics of interest is no longer fun, I prefer deciding what to read based on titles, or topic-based discussion.

I am a huge fan of RSS and have never stopped using it to follow blogs and webcomics. But lately, as I’ve read lots of people’s thoughts on timelines, a question has been niggling at me: what does make an RSS feed* feel better to use than “the timeline” of social media? They are both streams of information, yet I prefer RSS.

*by RSS feed, I mean the stream composed of multiple individual feeds — it is a little confusing that the singular and plural/collective of feed are the same.

Continuing in the vein of exploring what makes a blog a blog, I’m curious why an RSS feed feels better than social media timelines. Are we conflating our love of blogs with a love of RSS, or is there something inherent to RSS feeds that we really do prefer over other timelines?

I think it’s useful to dig into what elements of the experience make a substantive difference, so we can make better design choices with new tools in the future. I’m interested not in the technical details here (yay RSS is open and not owned by a corporation, boo it’s kind of a pain to explain and set up) — I’m interested in how we use the technology, and how we feel about using it.

Categories
Business, Entrepreneurship, Relationships, Society

Build a reputation instead of a personal brand

Replied to The personal brand paradox (wepresent.wetransfer.com)

When we position ourselves as a brand, we are forced to project an image of what we believe most people will approve of and admire and buy into. The moment we cater our creativity to popular opinion is the precise moment we lose our freedom and autonomy.

But rather than manufacturing a personal brand, why not build a reputation? Why not develop our character? Imagine what we could learn from each other if we felt worthy as we are instead of who we project ourselves to be.

I think it’s interesting to look at personal brands through the lens of insecurity. I imagine many people think of branding as “positioning” or storytelling, but underneath, those are only needed if you’re afraid you won’t be enough on your own.

I think it can be helpful to consider personal branding as a form of self-discovery, a tool to help determine what you want to do, but it carries a risk of self-containment.

I think of my other blog, Cascadia Inspired, which I started ten years ago as a way to get to know the Pacific Northwest better. I bought into the idea that blogs need to focus on a particular subject area or no one will read them. While I’ve enjoyed writing there, to some extent it created a constraint around what I felt was appropriate to write about. For example, I didn’t publish photos from anywhere outside the Northwest, so I have all these Southwest trip photos I’ve never shared, except maybe on Instagram.

Likewise, I had created a portfolio website at tracydurnell.com, and felt obliged to leave it serving solely a professional purpose. When I let go of that and transitioned to this blog-like format, allowing myself to write about whatever I wanted, I started writing so much more. I hadn’t realized how much I was holding back.

I still don’t expose my entire self here, but I’m much more open and vocal about my opinions, and more willing to risk publishing imperfect posts that show my incomplete thoughts in progress. I’ve held myself back and quiet for too much of my life already.

I’ve also realized I’m more interested in following people as people — while I might have been drawn to certain blogs in the past because of the topic, the reason I keep reading many of them is having gotten to know the writer. For example, I used to read Get Rich Slowly, but stopped when J.D. sold it (he’s since bought it back). I lost a lot of interest in Design*Sponge when my favorite writers there moved on to other things, and looked mostly to Grace Bonney’s articles. Even though she’s moved on from writing about design, I’m still interested in her work.

I find myself drawn more to what individuals are writing than to publications; if others are like me, all the publications that treat their staff as disposable and interchangeable will be in for a rough ride when they try to replace them all with AI churn content. Sure, you’ll pick up some SEO shit clicks, but that actively breeds distrust instead of long-term readership. I read my first Ed Yong article because I was interested in COVID; his thoughtful writing and reporting earned my trust, so I started following *him* on Twitter — not The Atlantic. I read Annalee Newitz back on io9; last year I read their non-fiction book, and this year I’m looking forward to their next fiction work.

This is what makes self publishing viable for journalists and writers: people following them for them, not for their title or brand. When writing for a brand constrains these writers, good for them to split off and start their own thing where they can write about what they want, how they want.

Categories
Learning

The point of reading

Replied to The Imperfectionist: How to forget what you read by Oliver Burkeman (ckarchive.com)

This is an understandable response to the information environment in which we find ourselves, I think. After all, there’s just so much useful and interesting stuff out there, and so little time, that it feels incumbent on us to take ownership, so to speak, of the little we do manage to consume – either by literally memorising it, or storing it in some well-organised external system. Otherwise, wasn’t reading it in the first place a waste of our precious time?

This utilitarian perspective is easy to internalize in productivityland. But it shares the same core as the mindset that books aren’t worth reading, that truths ought to be distillable into a short listicle, that fiction is a waste.

I suspect part of the urge to read more, learn more, is related to self-doubt. When we lack confidence in our opinions, when we lean on quoting others instead of using our own words, it’s rooted in fear that we are not enough. We seek more information to affirm our beliefs; the quest for certainty is a classic expression of anxiety. As a recovering perfectionist, I have suffered from difficulty making decisions and a lack of confidence in my choices, which I hoped learning more and practicing more would resolve. (Obviously it’s a balance — learning nothing and basing opinions solely on vibes isn’t a great approach either.)

It’s easy to operate on the assumption that the main point of picking up a book – a non-fiction or work-related book, at any rate – is to add to your storehouse of data, hoarding information and insights like a squirrel hoarding nuts, ready for some future moment when you’ll finally take advantage of it all.


But that’s a recipe for living permanently in the future, never quite reaping the value of life in the present moment. Better, I’d say, to think of reading not as preparation for living later on, but as one way of engaging with the world, one way of living, right here in the present.

[T]he point of reading, much of the time, isn’t to vacuum up data, but to shape your sensibility.

👏👏👏

Sometimes we should trust the vibes. Our individualist perspective means that each person is expected to become their own expert in every topic so they can have “informed opinions.” Instead, what if we let ourselves lean on community as well as expertise to guide us? Accept that we cannot master all subjects, and don’t need to hold a strong opinion on everything. I want my nonfiction to have opinions, not pretend at neutrality. And I think that’s linked to what Burkeman’s talking about: we’re choosing whose opinions to listen to when we read an article or a book.

Categories
Science, Society, Technology

When “ambiguity is a feature, not a bug”

Replied to Pluralistic: Netflix wants to chop down your family tree (02 Feb 2023) by Cory Doctorow (pluralistic.net)

Suddenly, it was “computer says no” everywhere you turned, unless everything matched perfectly. There was a global rush for legal name-changes after 9/11 – not because people changed their names, but because people needed to perform the bureaucratic ritual necessary to have the name they’d used all along be recognized in these new, brittle, ambiguity-incinerating machines.

Digital precision

We encounter this problem often in the digital world in things like content-limited text fields and binary choices on a form (or limited options that drive us always to “other”).

The digital world demands exactitude in a way the analog doesn’t. I recall my dad, a TV station electrician, explaining the difference between analog and digital signals to me as a kid; I couldn’t understand why the squared shape of a digital signal — either you get it or you don’t — would win out over the more flexible analog signal, which degrades gracefully, delivering a lower-quality picture rather than none.
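That either-you-get-it-or-you-don’t behavior can be sketched with a toy model (my own illustration with made-up numbers and an arbitrary cutoff, not anything from my dad or the post): analog reception degrades in proportion to signal strength, while digital decoding works perfectly above a threshold and not at all below it.

```python
def receive(signal_strength: float) -> str:
    """Toy model: analog picture quality scales with signal strength;
    digital decodes perfectly above a threshold and fails entirely below it.
    The 0.5 cutoff is an arbitrary illustrative value."""
    analog = f"analog: {signal_strength:.0%} picture quality"
    digital = "digital: perfect picture" if signal_strength >= 0.5 else "digital: no picture"
    return analog + " | " + digital

for strength in (0.9, 0.6, 0.4):
    print(receive(strength))
```

The sudden drop from “perfect picture” to “no picture” is the digital cliff; the analog side just gets fuzzier.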

This inherent precision of digital information also influences the way we think about data. We interpret numbers as more meaningful than they are:

Excel results calculated down to four decimal places falsely imply a confidence unsupported by the input data.

Recipes call for a specific baking time, when everyone’s oven is a little bit different, and environmental conditions affect baking time by impacting the moisture content of the ingredients.

Ad metrics and pageview data and likes don’t translate reliably into reach or brand recognition or conversions. (Like Internet celebs with millions of followers getting book deals that don’t translate into sales.)
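The spreadsheet point can be made concrete with a quick sketch (my own made-up numbers, not from the post): dividing two small integers produces a readout with far more decimal places than the underlying data can justify.

```python
# False precision: 17 "yes" answers out of 23 respondents is rough data,
# but a default spreadsheet-style readout dresses it up in four decimals.
responses, total = 17, 23
share = responses / total

print(f"{share:.4%}")  # 73.9130% -- implies precision the sample can't support
print(f"{share:.0%}")  # 74% -- a readout honest about the input data
```

Both lines describe the same 23 people; only the second is honest about how much we actually know.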

Ambiguity of knowledge

Information that should be more directional than exact is treated as gospel. “The numbers don’t lie.” (Well, actually…)

Anyone who’s collected scientific data is aware of the messiness of reality that must be translated into the concrete as “data.” Theoretically, methodology codifies the decision-making matrix researchers follow; but given the scientific reproducibility crisis, it’s clearly a tough job. Give five writers the same prompt and you’ll get five different stories; can you be certain five researchers will record the same value from the same observed reality? It is a tricky thing, as a communicator, to acknowledge the limitations of what is knowable, and to what degree, without implying artificial uncertainty that can be exploited through mis- and disinformation. (I know those are the terms we use nowadays, but sometimes I’d just like the plain language: lies.)

Who determines reality?

As Doctorow points out, digital condenses complex reality into defined fields — and the people defining the fields are those in power / the elite. Powerful, controlling cultures demand that their perspectives be codified.

The “Shitty Technology Adoption Curve” describes the process by which abusive technologies work their way up the privilege gradient. Every bad technological idea is first rolled out on poor people, refugees, prisoners, kids, mental patients and other people who can’t push back.

Their bodies are used to sand the rough edges and sharp corners off the technology, to normalize it so that it can climb up through the social ranks, imposed on people with more and more power and influence.

When [Netflix] used adversarial interoperability to build a multi-billion-dollar global company using the movie studios’ products in ways the studios hated, that was progress. When you define “family” in ways that makes Netflix less money, that’s felony contempt of business model.

Netflix is careful to stick to the terminology “household,” but I suspect to many, household implies family. I know a married couple who live in different parts of the state for work; would you not consider them a household in how they run their finances and make their decisions? It is easier to justify a physical utility like Comcast requiring a connection at each physical location than a digital service like Netflix that is not location dependent.

This is true too for ebooks, which have fucked libraries royally by pretending a physical book could be loaned only twelve (?) times (lolololol I worked at a library back when we stamped checkouts, and lemme tell you, those stamp slips had space for like forty checkouts, and often the book was still going strong when the slip was full), and fucked individuals by pretending it’s only possible to loan a book to a friend once in a lifetime. Digital product corporations want the limitations of the analog with the benefits of the digital. The elites setting the rules want one account they can use at their multiple homes, but don’t want the poors, whose families are spread across multiple dwellings, to be permitted to share.

Categories
Art and Design, Future Building, Reuse, Technology

Retrocycling as entry to creative reuse

Replied to Field Notes: Why It’s Time for “Retrocycling” – Immerse by an author (Immerse)

Over the past decades, I have explored different approaches for repurposing outdated technologies, including video game consoles from the 1970s; TVs and slide projectors from the 1980s; CD players…

Loosely skimmed the article but love the idea of retrocycling as a way of thinking about reuse and product lifecycles.

Could you host a workshop/course to encourage hacking old tech?

An interactive event / pop-up space with old games and equipment?

A photography / documentary project where people could celebrate hand-me-downs by sharing their stories? Or a website where people could post their stories?

Categories
Finances, Political Commentary

The old classic, lying with statistics

Replied to Exaggerating China’s military spending, St. Louis Fed breaks all statistical rules with misleading graph (geopoliticaleconomy.com)

The Federal Reserve Bank of St. Louis published a jaw-droppingly misleading graph that portrays China as spending more on its military than the US. In reality, the Pentagon’s budget is roughly three times larger.

In an accompanying report, the St. Louis Fed admitted that China’s 2021 defense spending was just 1.7% of GDP, “which was the lowest share among the six nations in the figure”.

Yay! I love Actual Propaganda! With a good ol’ dose of racist fearmongering 🙃

My Biostatistics teacher in college devoted our entire first lecture to discussing ways you could lie with data, so we would be better able to recognize it — and hopefully, not do it.

If we acknowledged how much we waste on bloated military spending, we would have to come to grips with our spending priorities. We would have to acknowledge what we don’t buy with that money. Some of that money could help stop children from going hungry, or keep diabetic people (who aren’t on Medicaid) from dying for lack of affordable medicine 🤷‍♀️ (To name some real problems in the US that shouldn’t be controversial yet somehow are.)

A much more accurate graphic created by the Peter G. Peterson Foundation shows how, as of 2022, the United States spent more on its military than the next nine largest spenders combined – including China, India, the UK, Russia, France, Germany, Saudi Arabia, Japan, and South Korea (and several of these countries are close US allies).

Some of what our $$$$$$$ military spending buys is impressive: a rapid response force that can be wheels up in under 18 hours (the logistics of that alone are mind-blowing), a sophisticated anti-tank weapon that still beats everything anyone else has and is making a huge impact in Ukraine, and the development of GPS.

Preserving self-governance in Ukraine: A+++++++. But mayyyyybe we could spare some of the $850 billion we’re spending on the military this year to care directly for people?

Categories
Society, Websites

Blogs are a platform for normal people

Replied to Understanding blogs | Tracy Durnell by Murray Adcock (theadhocracy.co.uk)

I am a big fan of categorisation debates, so the concept of trying to define what a “blog” is (or isn’t) piqued my interest.

Further exploring what makes a blog a blog — which I agree I haven’t quite landed on yet:

The fact that blogs take the form of a building argument, not necessarily voicing their intent or conclusion immediately, but instead guiding the reader through the narrative to naturally arrive at that conclusion. I agree wholeheartedly with this take, but I’m not sure that this is the essence of “blog-ness”. I think that’s just how people actually talk when given a platform.

(Emphasis mine.)

This connects back to the democratization of self-publishing, leading to greater influence of oral culture (as you point out).

The word “given” here got me thinking — like the soapbox example, blogging is when people create and claim a platform for themselves. The work is self-motivated. No one’s telling us what to blog about. It’s not fulfilling an assignment. The things people blog about are the things they care about enough to spend their free time considering.

And because it’s not “for a purpose,” because it’s self-directed, a blog post needn’t fit a formal format. A lot of blogging really is ‘talking through ideas’ in text, in real time — the thinking and writing happen together. (Or at least it is for me, though I’m sure it’s not the universal blogging experience 😉) Even when a post is edited before publishing to center a specific conclusion reached through the drafting, a tenor of curious exploration or earnest passion often carries through.

That’s part of what makes a lot of content marketing so vapid and noxious: not only is it hollow of meaning, but it’s uninteresting signalling barely disguised as thought. It’s the writer regurgitating what they believe other people want to read about, or what they think will make them sound smart or good or clever. (Not that self-motivated blogging doesn’t have some measure of this, as all public writing does, but blog posts generally don’t feel calculated and perfunctory the way many churn pieces do.)

Blogs tend to be personal spaces (or places attempting to make themselves appear personal, as with brand/ business blogs) that give a person or persons a platform, but one which they want others to consider.

(Emphasis mine.)

This makes me think of imitation bees: the corporate blog tries to pass itself off as a Real Blog by looking like one at first glance, but once you start reading, you suspect someone’s been hired to write it… A lack of feeling, an unwillingness to voice opinions, an empty “we,” and a cautious, bland tone become apparent when writers produce for a brand that wants the SEO benefits of a blog without risking expressing any personality. They want to give the appearance of sharing knowledge and participating in community and conversation, but those are positive externalities to their goals of drawing traffic, building reputation, and ultimately selling widgets. I wonder whether I’m being too inclusive in accepting everything that claims to be a blog as a blog…

Categories
Learning, Reflection

Follow your curiosity deeper

Replied to The Power of Indulging Your Weird, Offbeat Obsessions by an author (Medium)

It’s enormously valuable to simply follow your curiosity—and follow it for a really long time, even if it doesn’t seem to be leading anywhere in particular.

This reminds me of when I traveled to the Mediterranean after high school; my coach didn’t think we were exhibiting enough Wonder as we encountered history, and made us write an extra essay about it. But what does Wonder look like? Must it be Awe, clearly written on your face, or can it be curiosity?

Wonder must be felt; it cannot be forced or faked. Likewise curiosity. There are many instances where “fake it till you make it” applies, but I suspect performing wonder or awe or curiosity for someone else prevents it from being felt. Someone else cannot tell you an experience is meaningful; you assign your own meaning. No one else can be curious on your behalf; you must find your own curiosities.

You can create conditions more friendly to experiencing the emotions you seek, but the emotion is not guaranteed. Place is one way to prompt connection with the past, but holding expectations of emotional meaning makes that connection easier to disrupt. We got up early to run the track at Delphi; the landscapers were there too, leaf-blowing. The modern din forestalled any bond with the priestesses of yore. Likewise, too much intent strains curiosity; it is an invitation to be followed, not a certain path. Expecting a direct trail keeps you from seeing the cairns and blazes marking a way off to one side, or from reading the topography for the easiest passage.

I like this encouragement to indulge my curiosity because sometimes I’ll be intrigued by something, then remind myself I have no reason to learn more about it or save it because there’s nothing about the information that’s relevant to my life or work. And sometimes that is true, but practicing curiosity inculcates that perspective in your thought habits, making it easier to be curious about more things.

Is the same true for wonder? Were we not trying hard enough to feel it? Is it a state of mind that practice can bring you to more readily? Both Wonder and curiosity require openness and humility, but feeling Wonder also takes vulnerability. Curiosity, in contrast, needs an acceptance of inefficiency. These additional demands may make one more challenging for some to feel than the other.

In Egypt, I doodled motifs from the walls of an ancient tomb — sketching and photography were my way of absorbing what I was seeing. Curiosity is an active engagement that adds to what exists, ciphering it through the self; Wonder is a receiving and a changing of the self. Curiosity seeks to unravel the mysterious; Wonder values the mysterious for itself. Constitutionally, I am more suited to curiosity than Wonder.

Categories
Activism, Future Building, Personal Growth, Relationships

Allow room for allies to make mistakes — because we all make them

Replied to we will not cancel us by Adrienne (adriennemareebrown.net)

We hurt people.

Of course we did, we are human. We were traumatized/socialized away from interdependence. We learned to hide everything real, everything messy, weak, complex. We learned that fake shit hurts, but it’s acceptable…

Canceling is punishment, and punishment doesn’t stop the cycle of harm, not long term.

We will be accountable, rigorous in our accountability, all of us unlearning, all of us crawling towards dignity. We will learn to set and hold boundaries, communicate without manipulation, give and receive consent, ask for help, love our shadows without letting them rule our relationships…

Shaming and condemning mistakes simply makes others less willing to try or speak up, and less willing to admit their mistakes. This has a chilling effect, keeping people in line with what the loudest have decided is right, even when there are valid arguments for other perspectives, and it hardly encourages relationship building across identities and ideals. Righteousness is just as unhelpful from the liberal corner as from the conservative.

You can hold people accountable without being a dick about it. Not to tone police, but sometimes people on social media talk about others as if they aren’t a person too, and the intensity of condemnation feels greater than the sin. “Nice” is bullshit, but you can be kind and critical.

I guess I’m pretty sympathetic to the choices workers feel they have to make to survive under capitalism. I’m thinking of a disabled person who was cancelled (I think in 2022) when it came out that they worked at a military research company for the health insurance and flexible work conditions. Some of the cancelling might have been because they had cancelled others in the past? But that’s just perpetuating the cycle. I’d rather see the conversation be “and this is why you shouldn’t cancel people, folks” than a dog pile of shame.

We talk about how there is no ethical consumption under capitalism, and I wonder how much that extends to our lives too. It very much becomes a judgment call of what crosses the line to be part of the progressive tribe: ok, working at military companies is out, how about tech companies? Is it only bad if you work on Google’s military contracting projects, or is the whole company tainted? What about working for social media companies that sell our data, if you’re in another department? Is working in anything involving marketing out? How about owning a house, knowing the land was stolen from indigenous people and our treaties with them not honored? How about owning an iPhone knowing that conditions are so bad workers commit suicide on shift? Is catching an Uber home from the airport ok? Is flying ok? Shopping at Amazon?

It is easy to judge others’ choices but defend our own hypocrisies and compromises, instead of helping people recognize the harm they are causing, acknowledging and making up for ours, and working to change the systems that force people into hard choices (like advocating for universal health care and fixing our disability qualifications). I’m not naming or excusing any choices here, and also not supporting social punishment.

The lines I draw for myself are different from everyone else’s, and may change over time with my perspective and circumstances. I have the privilege of owning a home, being married to someone whose job gives me health care, and having a good savings and no student loans; that gives me the ability to make choices others cannot. Physical ability, wealth, family support, obligations and debts, and mental health all shape our decisions.

Maybe some of my perspective here comes from years of being a vegetarian. Many people took my personal calculus to be a judgment on their choice to eat meat, but it really was a personal decision; all I wanted from others was for there to be literally anything I could eat if they hosted (I usually just brought something). Now, for a variety of reasons I’ve become pescetarian. Maybe one day I’ll go veg again, or maybe someday I’ll start eating meat 🤷‍♀️ We can’t necessarily predict how our circumstances will change our choices.

Systems of oppression and those in power acting unjustly should be the main targets of action, while we offer solidarity to workers doing their best to get by, even if sometimes they screw up. It’s easier to attack or ostracize a nobody than to speak truth to power — but it’s a poor outlet for emotional pain and frustration. Purity and ideological perfection are dangerous social concepts, and I would rather have people feel safe enough to make mistakes than withdraw from community for fear.

See also:

The addictive nature of Twitter

Cancel Culture

Categories
Getting Shit Done, Meta, Technology

Use different tools for creation and consumption

Replied to

I just realized I have mostly migrated consumption to my phone, somewhat unintentionally — but because I read articles on my phone, I also tend to compose my commentary there, even though typing on my phone sucks 😂 The editor is also hard to use on my phone, and cutting and pasting doesn’t work correctly, so I edit less than I might on desktop. On my phone, I can only see about two sentences at a time, making it harder to write longer form work.

How much does the tool shape what content people produce? Considering many people no longer have desktops and solely use phones for computing, does lacking a PC deter them from writing? How much of the shift to video is because it’s simpler to film than type on phones? How much is the rise of microblogging and descent of blogging tied to smartphones?