An exchange between myself and my wife, about someone else we know:
Her: It’s hard to say no to her.
Me: No it’s not. It’s two letters. One syllable. Simple. Done.
I have given up on Reeder 5’s native RSS feed handling with sync via iCloud. I had expected my devices to keep in sync, and for Reeder to sync in the background rather than whenever I started it. I think I asked too much from it. I use three iOS devices each day, and Reeder was showing me stale content on all of them at various points each day. I suppose local device syncing of 75 feeds was a bit too much for it to handle quickly and smoothly. I moved my RSS subscriptions back to Feedly and am once again satisfied with multi-device support. I think I have learned that a web service works better for me, which probably should come as no surprise.
It’s a uniquely American thing, I have read, to work while on vacation. I have had to, as recently as last year. Some of my coworkers were reading, reviewing, and revising work papers and memos while on vacation last week. My vacation this year hasn’t technically started yet, but I am now in my family’s summer-vacation spot, my in-laws’ lake house, primed to report in for remote work in the morning.
Next week, I have planned full-time remote work while on vacation with my family. I hope to have no unplanned work time after next week, when my vacation proper finally begins. That kind of thing has happened in the past, though, and it could happen again this year.
I feel guilty each year for taking off two weeks with my family and finagling a week’s worth of remote work to tack onto it. It seems near impossible and nigh immoral to take off such an extravagant length of time from work. The idea is coming entirely from my own head at this point; no one explicitly tells me I’m doing something wrong, but it certainly feels like I am. I also wonder if my vacations are taking away from my opportunities for promotions, raises, or bonuses. In the end, it is more important to me to spend vacation time with my family, even though it feels like I am stealing that time from work.
I missed David Rothkopf’s article, “We Still Won’t Admit Why So Many People Believe the Big Lie” on The Daily Beast, when it was published a couple weeks ago. He asks why some people believe the “Big Lie” that Donald Trump really won the 2020 presidential election. His answer is an obvious but largely unspoken truth:
…even the most modest amount of analysis and introspection will reveal that buying into the nonsense peddled by the former president and his clown college of cronies is not an aberration, not due to some momentary lapse on the part of the American electorate. We were raised on lies—including many lies that are much, much bigger than the big one that troubles us today.
Society asks us to believe all kinds of lies. We know this. We participate in it. And, with the exception of some period of adolescence when you figure out that the adult world is full of bullshit, we never rail against it or even really think about it. But we all know, in some ways, lies hold together all of human society. Don’t be surprised if people believe them. Don’t be surprised if people know they are lies and still believe them.
I am neither surprised nor disappointed by Apple’s impending pull of “iDOS 2” from the App Store. The whole point of buying Apple hardware has always been to buy into a unique ecosystem: the “walled garden.” While the Mac has never been locked down to Apple’s Mac App Store, the iPhone has always been locked down to its App Store. It’s easy to forget now, but the iPhone grew out of Apple’s prior consumer electronics smash hit, the iPod, which was completely locked down. It didn’t have an App Store. Neither did the iPhone, at first.
I hate to side with one of the world’s biggest companies here, but I totally believe that the iPhone is a console, as Steve Jobs described it. I knew that going into the iPhone ecosystem, and that’s actually what I wanted, and still want, from that ecosystem. I want an apps console that (for the most part) just works, and doesn’t require a lot of my time and effort to work smoothly and securely. I came at this from the other side: as an Android user who jailbroke and hacked his phone into something completely different than what its manufacturer and mobile data provider wanted or intended. The thing is, all that customization led to a system that was unstable, and I had no idea if it was secure at all, because the code (apps and OS) came from a bunch of different places. I just had to trust, blindly, that everything was OK. The iPhone imposed guardrails on my hacking activities—guardrails that I wanted, because what I was doing wasn’t working for me anymore.
I think a lot of people chafe at the idea that their most useful device is a console because we reserve that word for entertainment devices like the Nintendo Switch and the PlayStation, or the humble cable set-top box. It doesn’t matter, though, if the iPhone is more useful and more important than a video game system. What matters is how it is sold.
It isn’t exactly a secret that normal customers can only download iOS software from Apple’s App Store. Beyond hardware, access to that App Store is the fundamental thing being sold by Apple. Customers should know it when they are choosing a product. I doubt any of the complainers and hand-wringers commenting on this article on MacRumors were unaware of that going into their iPhone purchase.
I would love to run Windows games on my Nintendo Switch, but I can’t because it is a console and Nintendo does not allow it. That doesn’t surprise me, or anybody else for that matter. The situation is not really different than the one with the iPhone. If you want to run DOS on your mobile phone, the far-more-open Android universe is there for you—and it’s the most dominant OS platform in the world, too. Vote for it with your wallets and your time.
I like the new name of Cleveland’s Major League Baseball team. I think the team’s old name, the “Indians,” and especially its already retired Chief Wahoo logo, were problematic and absolutely needed to go. If you find the term “Indians” as applied to a ball club to be inoffensive, know that that merely means you are not among the ones who are offended by it. I am sure that, at best, a lot of negative stereotyping went into the choosing of that appellation.
I have heard that some Cleveland fans wanted the team to go back to one of its old names, the “Spiders.” I think that would have been a mistake. Spiders may be fearsome, but they are not widely loved and are easily squished. (I find them icky, even though they are theoretically fascinating, but that’s just me.)
The best part of the “Cleveland Guardians” name, in my opinion, is its positivity. After all, to guard the city is noble and brave. It is, however, a generic word that concretely identifies nothing in particular, so it is understandable that some of their fans may be disappointed.
The team needs a memorable mascot and logo to make whatever a “guardian” is seem strong and cool. After all, what is a “guardian?” I have most often heard the word used in the phrase “parent or guardian” which is a lot less exciting than, say, “guardians of the galaxy.” (I have tremendous respect for people who are guardians—those who take care of, and look out for the best interests of, children. That really is heroic.)
While I like the name “Guardians,” I will admit it is not the best name for a baseball team. The best baseball team name is the Hartford Yard Goats, but that name was already taken.
Tonight I learned my son has a LinkedIn profile. What’s more, he works for Microsoft.
My son is four years old.
My family is at the county fair this evening. It is a little hot but we are having a fun time. The kids love the rides.
I’m going to be driving most of the day tomorrow, so, via Ulysses, I’m scheduling tomorrow’s blog post in advance. I am not sure whether a scheduled post hits the Micro.blog timeline, though. If it does, I will probably use that feature more often.
Panic’s bright yellow pocket game system, the Playdate, looks cheerful and cute. I want one, plus the blocky stereo dock, just to put on my desk like other people place toys and figurines. Ars Technica published a review of the hardware and a preview of some of the games yesterday, which whetted my interest. I don’t know if it is worth the money for me to have a geeky objet d’art for my workspace, though.
Elizabeth Nelson wrote the best, most accurate take on Ted Lasso (the character) that I have ever read:
Unceasing optimism defines Ted Lasso. But roller-coaster mood storms, manic reveries, and seemingly deliberate head games also define Ted Lasso, the players’ coach, and make him one of the best and most-layered characters of the peak TV era. He’s a man who presents himself as two-dimensional, but who might actually be playing three-dimensional chess. We delight in his antics, marinate in his charm offensive, and celebrate his offbeat approach to winning the whole fucking thing. But at all times, there’s a slight worry, one that crops up in the back of our minds, about what he might be willing to do to make it happen.
People love the show’s positivity, but it also has a dark side that actually makes it good. There’s something just a little off about Ted Lasso, and that’s what makes him interesting.
While Ted wins over a bunch of potential enemies in England, his wife back in Iowa can’t stand his relentless positivity. Maybe you couldn’t either, if you were married to him.
Ted excels at darts because he spent every Sunday in a sports bar with his dad between the ages of ten and sixteen, when his father passed away. I don’t think it is normal to bring your underage son to a bar every single week. That isn’t exactly a normal childhood. Furthermore, losing a father at sixteen may have caused some emotional trauma that Ted buries deep, causing him to overcompensate with cheery paternalism in almost every interaction.
Most importantly, Ted fails. He failed at his marriage, he is trying and largely failing to be a good father, and he failed to produce a winning record for his team, or even keep them in the Premier League. As appealing as Ted is to the TV audience, what he is doing is not working—or not working yet—in the world of his story.
What made Season 1 great was the surprising complexity of what appeared to be a dumb, one-joke sitcom based on a commercial that hardly anybody remembered. Season 2 starts to air tomorrow. I hope the writers didn’t forget about the dark undercurrents that made their show about a nice guy work so well.
If there is one thing I would like my kids to learn from me, it is that one of the greatest, most impactful human inventions of all time is the double-entry bookkeeping system for accounting. If my son or daughter ever throws that term out in class when a teacher is writing a list of inventions on the board, there may be some “ums” and blank stares in the classroom, but I would be very proud.
Tim Harford reminds us of how important the double-entry system is, and of the system’s historical origin, in his essay “The Tyranny of Spreadsheets”:
In the late 1300s, the need for a solid system for accounts was evident in the outbursts of one man in particular, an Italian textile merchant named Francesco di Marco Datini. Poor Datini was surrounded by fools.
“You cannot see a crow in a bowlful of milk!” he berated one associate. “You could lose your way from your nose to your mouth!” he chided another.
Iris Origo’s vivid book The Merchant of Prato describes Datini’s everyday life and explains his problem: keeping track of everything in a complicated world.
Some of us have always needed to keep track of everything. The knock-on effect of accountancy, going back to antiquity, is that someone wrote down what people owned and traded, what was abundant and what was rare, and even what came from where. Because of that, we have been able to learn the languages and numeric systems people wrote with, and what people valued, how they traded, and much more, in cultures that have long since passed away. It isn’t all because of the double-entry system in particular, but I think it is pretty interesting nonetheless.
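The core rule of the double-entry system is simple enough to sketch in a few lines of code: every transaction is recorded twice, as a debit in one account and an equal credit in another, so the sum across all accounts always nets to zero. This is my own illustrative toy (the account names and amounts are invented), not anything from Harford’s essay:

```python
# Toy double-entry ledger: each transaction posts a matched debit
# and credit, so a trial balance always comes out to zero.
from collections import defaultdict

class Ledger:
    def __init__(self):
        self.accounts = defaultdict(int)  # account name -> balance

    def post(self, debit_account, credit_account, amount):
        """Record one transaction as an equal debit and credit."""
        self.accounts[debit_account] += amount
        self.accounts[credit_account] -= amount

    def trial_balance(self):
        """Sum of all balances; nonzero means a bookkeeping error."""
        return sum(self.accounts.values())

ledger = Ledger()
ledger.post("Inventory", "Cash", 500)  # buy wool for 500
ledger.post("Cash", "Sales", 800)      # sell cloth for 800
assert ledger.trial_balance() == 0     # the books balance
```

If Datini’s associates had dropped one side of an entry, the trial balance would come out nonzero and flag the mistake—which is exactly the error-checking property that made the system so powerful.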
When I was a kid writing essays for school, I always peppered my writing with big, fancy words in places where small, simple words would do. In part, I was showing off my big vocabulary to my teachers. For the most part, though, I was afraid to use the small, simple words because I thought of them as childish. I figured that small, simple words would make my writing seem small (as in insignificant) and simple (as in simplistic) as well. Of course, now I see that as folly. An argument that is simply stated is easier to understand, and thus more convincing, than one delivered in showy, but less comprehensible, language. While there are some glorious, complex words that I would never want to do without—incarnadine, mellifluous, loquacious, raconteur, elixir, donnybrook—I would never want an argument I write to hinge upon them.
My new grill was delivered today, but it arrived damaged and I had to refuse it. No grilled hamburgers for me today, I guess. At least I noticed the dent and scuffs before the installers got far along with its setup. Hopefully a replacement will come before the end of the week.
I read this quote years ago in a novel called The Gargoyle that fundamentally changed my life:
Love is an action you must repeat ceaselessly.
Love is an emotion that does not meaningfully exist unless it is expressed through action. Actions are governed by choice. Thus, to act with love is a choice you have to make every day, many times a day. It doesn’t matter how you feel: love is what you do.
Today we are going to a family birthday party, then restocking the fridge for a week’s worth of camp and work lunches.
It is popular now to lionize failure. Both fictional stories (thematically) and nonfictional accounts (explicitly) praise failure as a stepladder to greatness. It makes sense. Failure is a great teacher. It teaches us what does not work. It teaches us what we need to do better, and what we need to work harder on. It teaches us what our limitations are, and prods us into further developing our strengths.
While this is, in some way, timeless wisdom, the overt emphasis on it in the business book, personal productivity, and self-help space is relatively new. Our idols in business—wildly successful, and sometimes unimaginably wealthy, people—now regularly explain that they arrived at whatever monumental success they are famous for only after navigating a bumpy road potholed with failures all along the way. They tell stories about companies they started that went under, or products they developed that never found a market, or ideas they had that did not work with as much or more pride and excitement as when they talk about what actually did work for them. They talk about failure almost like it is success. They prod us to consider failure as the best learning opportunity—something to embrace rather than fear.
Well, sure, if you can handle it. For most people, it is a privilege to be able to fail. You have to be able to weather the storm. Not everyone can. It takes resources, like money and the right friends. It takes mental wherewithal, like good mental health (or, in some cases, psychopathy) provides. It takes opportunity, which for normal people often comes down to luck. Not everyone has those things. Not everyone who falls down can pick herself back up again. Sometimes a door closes and another one does not open up.
When serial entrepreneurs, angel investors, or CEOs wax on about their failures, be wary. Their discussion of past failures may be instructional, but it is also a showy way to appear humble. It disguises the privilege one often needs to survive failure without economic, emotional, or reputational humiliation.
Money and connections help some people fail upward. Their unsuccessful companies don’t liquidate; they get bought by other companies. They don’t go bankrupt when a business venture fails, but their employees lose their jobs and their investors take a hit. These exits are not failures; they are minor successes—at least for the owner or entrepreneur. The road to their success is not paved with failure. It actually is paved with minor success after minor success.
So don’t believe it when someone tells you that experiencing failure is the only way to learn how to succeed. That simply is not true. Failure is a great teacher, but it is not the only teacher.
I think about education a lot, not only because I have kids, but also because my wife is a high school teacher, and I like to help her think up lessons. If I were a teacher, I would challenge my students to think really hard, to question what they have been told, and to go beyond the material and create new thoughts of their own based on what we discuss in class. My wife tells me that my level of instruction would suit some grad students, maybe, but not high school students, and presumably not undergrads either. That is dispiriting to me, but she would know better than I—she knows teens and young adults and intimately knows the educational system they are put through. Through her, I’m pretty aware that high school is not quite as I remembered it. Overall, it seems dumbed down compared to what I went through: loftier goals with lower expectations for achieving them.
I had figured that college was just the same as I remembered it—and thus would be very difficult for high school students to cope in, academically—but I am now not so sure about that. A few weeks ago, I read Greg Lukianoff and Jonathan Haidt’s Atlantic article “The Coddling of the American Mind.” It riled me up. I had seen Mr. Haidt sometime earlier discuss his thesis about the decline of thought on college campuses in favor of “safetyism” on Real Time with Bill Maher, and found it both compelling and dispiriting. I listened to one of his lectures on YouTube and thought about it for quite a while.
I went to Brandeis University—one of the schools mentioned in Lukianoff’s and Haidt’s article—in the 1990s. I remember that one of the main themes of my instruction there was that I should expect to be challenged and to confront new and potentially uncomfortable ideas. We, as a student body, were asked—explicitly—to learn about, discuss, and work through new ideas together. It wasn’t supposed to be easy. It was supposed to be difficult. Difficulty was the whole point. Open debate was how it worked. That was liberal arts scholarship—and I’m pretty sure college campuses were thought to be politically-correct, liberal romper rooms even then, so maybe even I got the watered-down version. Is the academic culture I came up in gone now, in favor of a focus on comfort and mollification at the expense of debate and rigor?
Lately, I have wondered how it really is on college campuses. How much of this worry, put in me by some of the things I read and see, is about someone else’s hysteria about changing cultural and social norms? Are the stories about wokeness, safe spaces, trigger warnings, microaggressions, and call-out culture that I have seen on the Internet and in The New York Times the norm, or exceptions to the norm? What kind of story would make it all the way to me? Only the exceptional ones, I think. Maybe freedom of thought on college campuses isn’t as constrained as I have been led to believe. I suppose I will find out when I explore this issue further, or years down the line when my kids are ready to look for the colleges they will attend.
The Steam Deck looks fantastic. It’s like a Nintendo Switch for Steam games. Considering I wrestled with a Windows laptop for 8 hours to get it ready to play Steam games just days ago—and had the whole thing fail, the very next day, to recognize the very Xbox 360 controller I was using the day prior—I never want to bother with PC gaming again. It’s too fiddly and too expensive. But this device might be easier to set up and more performant than the $2,000+ (but not specced for gaming) Windows laptop I already have.
I have been having a lot of trouble finding books that excite me lately. Over the past month, I read through the first Grishaverse trilogy (starting with Shadow and Bone) and found it only OK. (It could have been a single, much faster-moving novel, I think.) I started Leviathan Wakes, the first novel in The Expanse series, and couldn’t get into it. Most recently, I started Black Sun by Rebecca Roanhorse and, after reading a few chapters, concluded that it is not for me.
This blah feeling about well-regarded sci-fi and fantasy books surprises me. I used to love those genres. Now, when reading them, I find myself distracted by details that don’t make sense, like: If the planet the story takes place on is much larger than Earth, then it would have a much stronger gravitational field, and all life on it would have evolved differently (i.e., be sturdier and stronger), and humans wouldn’t be able to move as easily on its surface. I can suspend my disbelief to quite a big extent to accommodate an alternate history, a magic system, or a populated exoplanet. However, for the story to be satisfying, the secondary effects of these fantastical elements have to make sense and properly serve (rather than work against) the story.
As a young adult—after earning my English degree, and until I had kids—I only read classics. Back then, I was looking for books I could get or borrow for free, and for novels that made me feel like I was continuing my studies in literature. I strayed from that path after a while, as my professional life became more technically-oriented (I read books about programming and systems design) and as I went through business school (I read about business, economics, sociology, and so on, to understand how the world worked). I started reading for escapism (mostly) and for learning (sometimes) several years back, but I am finding it hard to get interested in anything any more.
I think that means that I am now ready to go back to reading classics. To that end, I started James Joyce’s A Portrait of the Artist as a Young Man last night. I love it so far. It is formally challenging, full of ideas, and grounded in the real world. I know how to tackle difficult books, and plan, at this point, to read Dubliners after I finish Portrait. I’m looking forward to it.
I decided to upgrade my Windows laptop to Windows 11 yesterday. It was an awful slog, replete with many failed Windows Update attempts, permissions issues that prevented me from changing the necessary diagnostic data sharing settings, and even a hardware firmware update. I ended up having to “reset” Windows—which deleted all my apps, data, and settings—to move forward.
However, hours later, once Windows had finally gotten all of its updates in, I had the opportunity to use Winget for the first time. Winget is Microsoft’s command line package manager. Although not as mature as the package manager I am most familiar with, APT, Winget is pretty awesome. I was able to use it from the Windows CMD shell (because Windows Terminal does not come pre-installed) to install everything I needed, from Visual Studio Code to Vim to NextDNS. It saved me a ton of time. If I set up new Windows machines frequently, I could write a script to automate all of it.
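A setup script like that could be as simple as a batch file of `winget install` commands. This is only a sketch of what mine might look like; the package IDs below are my best guesses from memory and should be confirmed with `winget search` before relying on them:

```shell
:: setup.cmd -- reinstall my usual tools after a Windows reset.
:: Package IDs are assumptions; verify each with "winget search <name>".
winget install --exact --id Microsoft.WindowsTerminal
winget install --exact --id Microsoft.VisualStudioCode
winget install --exact --id vim.vim
winget install --exact --id NextDNS.NextDNS
```

Run from an elevated CMD prompt, a script like this could rebuild a machine in one unattended pass instead of a night of clicking through installers.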
It is interesting to see all the developer- and administrator-friendly enhancements Microsoft has made to Windows over the past few years. Windows Terminal, Winget, and Visual Studio Code are miles ahead of what Microsoft offered just a few years ago.
These are important ideas worth looking into and thinking about. An idea from another outside source—the Make Me Smart podcast—came to mind when I read the post: “None of us is as smart as all of us.” (I just learned that the Internet attributes that quote to Kenneth H. Blanchard.)
I have been an Apple Music subscriber from the service’s very first day. I was an iTunes Match subscriber before that. I have been pulling an enormous electronic library of albums and songs forward with me since I was a teenager. We’re talking about thousands of albums, dozens of playlists, and a lot of compilations that have stuck around in my library since the CD days.
Apple Music gobbled up all that music info and turned it into a frustrating mess for me. I have duplicate albums, duplicate tracks in my albums, unplayable tracks in some albums, and most of my library does not play in Apple Music’s new lossless format. A lot of music playing for me takes place outside my library, but that is in large part because my Apple Music library is an awful mess.
Today, I just couldn’t take the jankiness anymore, so I took the nuclear option: I deleted everything from my Apple Music account. All my music is gone. I also turned off the “add songs from playlists added to library” option, which was inconsistently applied amongst my devices, and peppered my library with useless, one-track albums. I added some of the artists and albums I actually listen to now (as opposed to many years ago when I bought a used CD somewhere), and will rebuild my collection from there.
Despite my nuking of my Apple Music library, none of the music I bought in the past is really gone. All the CDs I actually wanted to keep are ripped to a well-organized, if rarely touched, folder on my NAS. I will probably continue to ignore this, to be honest, and focus mainly on what’s new.