Monthly Archives: June 2020

Social Media Shaming Versus Guns

Yesterday two wealthy middle-aged lawyers in Saint Louis were so terrified of a group of Black Lives Matter protesters who marched into their gated community that they stood on their front lawn in front of their gaudy mansion brandishing their guns.

I suppose the protesters were guilty of trespassing, but rarely has one image so perfectly summed up how Americans look to the rest of the world.

Donald Trump and the right-wing media have already started putting this couple up on a pedestal, calling them heroes.

In the short run, I suspect that both of them are going to learn that the soft power of social media shaming is more powerful than an AR-15. Who exactly wants to hire a lawyer so paranoid and hateful that he or she would murder people for trespassing? A confident ruling class has no need for threats. It rules by propaganda, by its ability to convince the rest of us that it’s deserving of its power and privilege. These two look ridiculous. I suspect that even among their fellow bourgeois elites they’re going to be very lonely for the next few years.

In the long run, however, this is a sign of the ugliness to come. The (mainly black and brown) working class is starting to feel more confident about violating the gates and the boundaries society has set up to keep them down. A lot of rich, hateful white assholes are just dying to shoot somebody. It’s also why I’m a rare person on the far-left who’s in favor of gun control. Absolutely nothing good can come of any of this.

Arrest Trump

The Iranian government has issued an arrest warrant for Donald Trump.

Iran has issued an arrest warrant and asked Interpol for help in detaining US President Donald Trump and dozens of others it believes carried out the drone strike that killed a top Iranian general in Baghdad.

https://www.aljazeera.com/news/2020/06/iran-issues-arrest-warrant-trump-asks-interpol-200629104710662.html

The American corporate media is going to treat this as a joke. After all, the only people who get prosecuted for war crimes are Serbians and Africans. But the Iranians are well within their rights to demand justice against the criminal government that ordered the assassination of one of their top generals. The Democratic Party and the Clinton crime family have demonized Russia for years simply because of an alleged hack of (idiot Boomer) John Podesta’s email. Can you imagine how the American government and media would react if the Iranians ordered the assassination of one of the Joint Chiefs of Staff?

So let’s end the double standard between the Global North and the Global South. Someone should make a citizen’s arrest of Trump and lock him up. It would solve (many of the) problems of both Iran and the United States.

Reading Video Games

VIDEO GAMES AND JUNK CULTURE

Having largely ignored them for most of my existence, I finally came around to video games a few years ago and have since been exploring the canon, mostly with an emphasis on things made before the year 2000. My interest was initially academic-I’ve been writing a long manuscript on the history of TV for some time and it seemed like any manifesto on the nature of TV that didn’t acknowledge video games was going to be woefully incomplete.

The scant attention paid to video games is odd in comparison to other 20th century mediums-whereas everything from cinema to broadcast TV to comic books eventually found a community of people willing to discuss them intellectually, not much on that front has been done with video games. And while this is pretty common in what’s still a fairly early time for a medium considered to be disposable or low culture, this doesn’t help somebody trying to write about them. Or rather, it’s fun and exciting in the sense that there’s so much to cover, but that nagging insecurity is still there that any salient points I get to will just work as forgotten stepping stones toward a more developed or advanced theory.

Video games differ substantially from prior mass media forms in numerous ways. Unlike other media, you by and large are not in control of the level of engagement you need to have to get something out of it. I can put an old movie on in the background and the movie will play whether I’m paying attention or not. Presuming the mixing was done competently, the only buttons I need to hit are to turn on the TV and DVD player and then hit play.

This need for engagement stemming from the initial distribution model of quarters for play time makes the medium both more and less mentally stimulating. On the one hand, every game that can be beaten is, on some level, a puzzle game-even something like Super Mario Bros mixes large amounts of strategy with hand-eye coordination. And even a game that can be beaten without strategizing much can always be beaten better in some way. In this sense, games require more active thought than most things. On the other hand, this thought is confined to the arbitrary parameters of something designed entirely for immersion-video games as a medium have been more resistant to a “realism” movement than any other medium I can think of. Obstacles are simple and unlike in real life, one is assured they can be overcome with the right answers, answers that relate heavily to other video games but don’t interact much with the world outside video games. Like Euclidean geometry they are a set of rules that are internally consistent but untouched by nature.

Another appeal is the simulacrum of unfettered movement and unimaginable power without consequence-the appeal of a dream where one is flying. The body is both immobilized and immersed-the eyes, ears and hands are all actively engaged in an activity that punishes you for letting your mind wander. Tellingly, my girlfriend who has little experience playing games always describes her frustrations with their difficulty thus: “It feels like one of those dreams where I can’t get my body parts to do what I want them to.” At the same time, this flying dream appeal is necessarily limited by the complications needed to establish an effective psychological reward system to encourage people to keep playing. I can run as fast as I want to, but if I touch the wrong thing I die and am reborn. The world of speed running then becomes one of layered dreams; the fantasy of escaping better, of a zen merger of the inherent you-game duality.

The need for near-constant interaction also limits the extent to which games can function in a didactic role the way films and literature and even comic books frequently do. It’s far more obvious and feels far more ridiculous when a video game is telling me about saving the environment than when I’m watching a documentary that’s literally just talking at me about the same things. No one has ever made a successful “game polemic” and understandably, no one really wants one. A polemic implies a person speaking (or writing or whatnot) and a person or persons listening, and the polemic’s power comes from the speaker’s position as distinct from the listener’s. A video game works on a collapse of that dynamic. Unlike any prior mass media form, a video game implies a breakdown of the consumer/producer dynamic, as is evident from the enormous competitive gaming and streaming scenes.

What is especially fascinating about the breakdown in this dynamic is that suddenly enormous numbers of people who would balk at say an art film making them work to get anything out of it will staunchly defend the difficulty of a video game, and people who’ve spent their time learning to read other forms of mass media in depth will frequently avoid the medium altogether for the same reasons in reverse.

Like the other major artistic mediums to come out of the 20th century going back to jazz, its early development being shielded from academic consideration may have been for the best, allowing it breathing room to go in its own direction. Thankfully, relative to other 20th century media, most of the early history has been preserved in some form, usually a form that’s pretty easily accessible, especially if you’re willing to spend a few dollars on a console and a flash cartridge (a thing that looks and acts like a video game cartridge but reads its data from an SD card instead of a flashed rom chip or optical drive). While I’m sure there are games that are lost (a few SNES Satellaview broadcasts, for instance), prototype games on unlabeled cartridges from the 80s and 90s seem to pop up every few months; that’s not a bad track record compared to the 90% of silent film and probably 98% of early TV (and 99% of the early internet?) that are completely lost barring the introduction of a time machine. This spirit of preservation in the retro gaming community is one of the things that sets it apart. The fact that the vast majority of games were home releases and not broadcasts or performances helps matters greatly. Software can also be preserved in 1:1 copies and with advances in FPGA hardware emulation it seems likely that the hardware itself can live on in a similar fashion, the soul of the machine transmigrating every few years to a different system on a chip. The rapid advance of flash cartridges and FPGA-based clone consoles represents one of the most important advances in cultural preservation in recent memory, given the highly ephemeral nature of computing hardware.

However, in preserving the experience, these also change the experience. Being able to pay $40 and have every Sega Genesis game at my fingertips is not the experience people who owned a Genesis when it was current would’ve had-games were very expensive, and having bought out people’s collections, on average the most intense fan of any given console still would only have 40-50 games at most unless they went on a buying spree when the stuff went on clearance. Games that seem to be difficult now were probably seen as having a good consumer value at the time since you didn’t want to pay $60 for a game then finish it in a day. This also added to the emotional attachment-to finish a difficult game brings that adrenaline drip of having accomplished something. You have to become familiar with each nook and cranny intimately or else you’re not allowed to move forward; in film you’re pushed forward in time regardless. It’s not that strange to attempt a video game level 15-20 times but it’s considered fairly strange to have seen any single film 15-20 times.

Games have a tendency to wander into what would be considered the extreme avant-garde in the film world. Making a film without content, a “pure film”, an obsession of the 60s structural film movement, was achieved quite early in video games and with none of the attached friction. In the cinema, asking people to emotionally engage with geometric shapes devoid of context is seen as a challenge to the viewer and the norms of artistic consumption and production; in video games it’s just called Tetris.

And even in games that could be considered to be at least somewhat closer to a traditional narrative, something like say Super Mario Bros, we’re still treated to a funhouse mirror version of the world ruled by what pleases the principles of industrial design. The introduction of consequences and a simple punishment/reward system makes it quite simple to suspend disbelief at a short plumber fighting over a girl with a deformed half-turtle half-dinosaur through a world of mushroom shaped things that either kill you on contact or make you grow to twice your size.

Like many former “low culture” media, there is a freedom that comes with a public’s inability or unwillingness to engage critically, and like prior “low culture” media, that freedom can be used for good or bad.

This makes games incredibly difficult to translate into film-the demands of each medium are diametrically opposed. The things that might make an interesting film tend to make a terrible game and vice versa.

Would I love to see a movie of Mario finally defeating Bowser and getting to be with Princess Peach only to discover getting the girl is the easy part-the true challenge is sustaining a marriage-that his true love was the pursuit and not Peach? Yes! Of course I would. There’s so much there. Mario seems like someone perpetually thrilled by conquest with no sense of the domestic beyond the pipes beneath a double decker ranch home.

Nintendo, if you’re reading this and looking to lose another $40 million on a second Mario Bros movie, I would make that in a heartbeat.

But would I want to play a game based on that premise? No, I wouldn’t (though I suppose some of the more cynical among us might presume that’s the backstory to at least part of Super Smash Bros.). The video game understands that Peach is a MacGuffin.

THE ROOTS OF VIDEO GAMES

In trying to find what defines a medium in opposition to other mediums, it’s generally useful to go back to the maxims set out by Marshall McLuhan in Understanding Media. Particularly salient here is his assertion that “the content of the new media is always the old media”-the content of early cinema mimics the stage play and the point where cinema comes into its own is almost always defined as the point when it breaks off from those roots.

So what is the “old media” that provided the basis for the first video games? The most obvious answer would be children’s games and casinos. The “?” boxes in Super Mario have that randomized reward thing going on like a slot machine. The other mechanics of the game resemble tag, much like Pacman and the hundreds of clones of Pacman out there like Devil’s World. Even a game as story and narrative heavy as Metal Gear Solid takes its basic mechanics from tag and tag’s weird nephew paintball, and the narrative, while skillfully constructed and quite thoughtful by game standards, still has to act primarily as a laundry line between situations where you’re playing tag with an imaginary gun; any substance to the narrative outside the experience of game play itself is gravy.

And then of course, the first 5 or 6 years of home consoles were dominated by what are called “dedicated consoles”, i.e. consoles with the games built in and no tech included to run other software-similar to contemporary “plug-n-play” devices like the SNES and NES Classic Editions that came out a few years ago. These consoles invariably contained simplified simulacra of tennis, ping pong, and other popular sports like hockey or basketball. Sometimes these weren’t even separate games but the same game with different transparent overlays you’d put over your TV to make it look more like ice hockey even when the gameplay is still identical to Pong. The earliest games then were defined by a combination of what was considered athletic leisure at the time and the severe limits of what early computers could do.

In the next generation beginning in the 80s, the lightgun game becomes very popular to the point many consoles included one as a pack-in. The most famous example is Duck Hunt-you take a plastic “gun” with a light sensor in the barrel, and when you pull the trigger the game flashes the targets on screen so the sensor can detect whether you shot at the TV in the right place. One wonders how the vibe in Graceland’s basement would’ve changed had Elvis lived to buy an NES console, being that he was probably the first person to pioneer using firearms in conjunction with CRTs. Maybe we would’ve gotten a hot pink Zapper.

Duck Hunt’s simplicity makes it a good one to analyze, though most of what I’m saying here could apply equally well to other early light gun games like Hogan’s Alley or Barker Bill’s Trick Shooting. Despite the more direct antecedent to the light gun game being mechanical pre-video game arcade machines that used guns that shot light (these date back to the 1920s), the gameplay of Duck Hunt is still centered around 19th and early 20th century ideas of bourgeois leisure-you go out with your faithful basset hound and shoot ducks or clay targets in the woods. The others take pains to resemble carnival shooting galleries. That the light gun was so integral to the normalizing of game consoles in the home is even more interesting when considering the first prototype ever made of a TV remote had the form factor of a pistol.

What is it exactly about TV that makes one want a gun so badly? Why did the inventor of the TV remote, forced to respond to the novelty of his discovery like it was a Rorschach blot, immediately think “pistol”? Perhaps the threatening qualities of the new technology might be mitigated in the minds of viewers by the repeated ritual of their staring down their sets at gunpoint-what could better reinforce that the TV is your subordinate? Like Joe Pesci, you point and say “dance”-it dances and doesn’t ask questions. You are authority-you bring law and order to the living room. He who has the remote becomes the sheriff of the home.

The lightgun is also the simplest of all video game controllers. Even the relatively simple standard NES controller has 8 inputs-the lightgun has only one. And the classic Atari 2600 joystick still theoretically has a whopping 5 inputs by comparison (up-down-left-right-fire). While this accessibility factor doesn’t help me too much in my theorizing, it should be acknowledged. Sometimes a cigar is a cigar, and sometimes something is just fun and accessible for reasons of mechanics that transcend cultural context. The relative failure of consoles with far more complicated controllers like the Mattel Intellivision would support this.

The Intellivision controller also highlights how important understanding McLuhan’s maxim was in the dog-eat-dog world of early gaming. For those who’ve never seen one, the Intellivision controller most closely resembles a very early mobile phone like you’d see built into the back of a limo in an old movie. It’s a Rembrandt-brown rectangle with a 12-button number pad. This number pad has weird mushy membrane buttons sort of like some electronic cash registers or a debit card reader/ATM. The directional control is a circular cardboard wafer you spin around with your thumb sort of like how you’d dial a rotary phone. But the old media the new media was feeding off of wasn’t the telephone. Nintendo understood that; Mattel presumably thought making the thing look old and muted would appeal to the largely untapped market of adults because it looked so little like something a kid could give a crap about. They were mistaken, and it died a slow lingering death. Furthermore, Nintendo knew the way to the adults was through their children, not by making them feel like they were running an errand at the bank. The woodgrain finish almost made the Intellivision look too serious and dignified-it looked as if it had a full time job and no time to have fun with the user.

And while I would argue the roots in sports and leisure activities of the past were the primary “old media” games cannibalized for their vessel, the urge to include or adapt aspects of narrative commercial cinema arose as soon as hardware was capable of doing so. I’m not talking about game spin-offs of films, but rather cut scenes (which at their pinnacle are usually described in the game press as “cinematic”) and point and click adventure games which would usually contain the plot of something that could’ve been a movie, wrapped in sprites with token bits of movement. While most of these were released for PCs and not consoles, they were still an enormous part of the mid-80s game market and mark a departure from earlier forms of gaming; these represent games shedding the necessity of their being defined in the negative-i.e. “it’s a game (at least in part) because I can lose.” Playing something like Snatcher for the Sega CD or The Secret of Monkey Island or the dozens of other games done in that style, you’re forced to solve a few puzzles but there’s no real threat of dying, just the threat of stalling progress within the game. You’re mostly just pushed through the plotline as if a DVD had merged with its menu. The limited motion in the images also suggests early 20th century comic strips before the universal adoption of speech balloons, Choose Your Own Adventure books marketed at young adults, and their early digital counterpart: text adventures, which developed contemporaneously with the Choose Your Own Adventure books. Both owe much of their structure to early tabletop games like Allan Calhamer’s 1954 game Diplomacy and of course the various revisions of Dungeons and Dragons, which even resembles computer processing through its use of unusually configured dice to add a mathematical element of chance and spontaneity to the game.

 

TOYS VS FURNITURE VS APPLIANCES

The earliest TVs most resembled vanity cabinets and were meant to be integrated into the home as attractive pieces of furniture. This was due to the fact that you needed a large volume of electronics to run a fairly small screen and needed to put them somewhere, but also due to the fact they rose to prominence at the same time as US home ownership skyrocketed due to the GI Bill and the post-war boom. But as time and tech advanced toward using smaller or integrated components, and TV ownership became a given of the home as opposed to a status object, the aesthetics of TVs drifted from display piece to functional object meant to be as invisible as possible. The ideal TV of the present moment would be all screen with no chassis; the power trip of the remote control no longer registers as such and feels more like another technological hurdle before doing something in a world overrun with such hurdles. With some power comes some responsibility, and who wants that when you’re trying to watch TV?

Game consoles however, didn’t quite have a furniture phase, having emerged too far past the home ownership boom. Some manufacturers thought they were toys and marketed them as such-Nintendo famously sold people on the NES console after the great video game market crash of 1983 by selling it through the giant plastic Trojan horse of R.O.B. the Robot, which made it look more like a toy than the video game consoles everyone was pissed at after E.T. for the Atari 2600 came out (along with a lot of other unplayably bad 2600 games). The US version of the console, the famous “toaster” model, was redesigned from the Japanese version to more closely resemble a VCR.

Further emphasizing their unusual hybrid nature, while every other appliance made in the period of the game industry establishing itself and its norms would strive over time for fewer and fewer buttons, culminating in the eventual near-complete elimination of buttons from the Apple iPhone, game consoles trended towards more and more buttons and joystick components, until the most recent generation, where I think most of the companies realized that people are confused and frustrated by anything with more buttons than a PS2 DualShock controller.

Game consoles, due to their general parameters not having been defined yet through repeated practice, also serve as a fascinating study in the economy of stuff vs. space, which has been one of the defining cultural issues of our time. In less than a generation, the indication of status moved from having stuff to having space, and notions of physical size or volume of an object correlating on a scale with perceived consumer value flatlined. Being rich “the right way” went from Charles Foster Kane’s Xanadu of boxed random stuff to Steve Jobs and his famously empty apartment, empty except for, of course, an incredibly expensive Tiffany lamp. In their time of flux, game console design went after both approaches with varied success-the NEC Turbografx 16 was so small that when a reissued “mini” version of it was released last year, they couldn’t get it much smaller than the original model. Toward the other extreme, the Atari 5200 infamously takes up more space than a full sized surround sound home theater amplifier despite containing not much more in terms of hardware than the 2600 did.

An analysis of the size of game consoles should also take into account hybrid abilities-while the first model Playstation 2 is enormous, it also played CDs and DVDs, so for non-audiophile consumers, despite its large size, the console actually saved space by sparing the person from buying a separate DVD and/or CD player. This integration of the home media center from a division of labor through things like component hi-fi systems to the current standard of “a TV with the cable box, internet and sometimes even gaming capabilities built right in” would seem to be a positive thing. Less physical volume of industrial production means less waste. But at the same time, it greatly increases hardware failure and makes it increasingly complex to repair and salvage these pieces of hardware, increasing the quantity of eventual e-waste. Every iPhone X produced right now will eventually be unsalvageable e-waste because they’re designed to be completely proofed against user servicing, down to putting in booby traps that will brick the phone if you make the slightest error trying to do something as simple as changing the battery. This should be illegal and a massive issue, but doesn’t seem to be outside of right-to-repair circles.

Video games are also odd in that they thrive on constant format wars that would hobble most other industries. If there were an HD-DVD vs. Blu-Ray war every 5-7 years, would people still be purchasing home videos or would consumer confidence be shaken to the point they’d take a tech downgrade in favor of market stability? This is a rhetorical question of course, as that was what happened when VHS and Beta went at it. Similarly, it should be noted that the cliche that pornography determines the outcome of format wars is less true than the rephrasing: game consoles’ integrated components determine the outcome of format wars. DVD rose to prominence because of its inclusion as a feature in the Playstation 2, and like many people, my first and only DVD player until I got to college was my PS2 slim. Blu-Ray probably vanquished HD-DVD because Sony built a Blu-Ray drive into the PS3. Sometimes these integrated components were good enough to eclipse the systems themselves. I have a PS1 that I exclusively use to play music CDs because it sounds substantially better than my other more high end CD playback devices. My only tablet computer is my Wii U gamepad.

Moving forward, it seems more and more likely that the game console as a separate device meant specifically to play games will be phased out. This however puts console manufacturers in a good place, as it gives them the opportunity to expand and seize market share from other large sectors of the home entertainment industry. The tendency towards people living in smaller and smaller spaces on less and less money makes the obviousness of the appeal unbeatable. There will still probably be a few guys like me with hanging-garden-of-babylon level cord tangling behind their media centers, but we’re a dying breed.

 

CONCLUDING STATEMENTS (FOR NOW) :

Video games, at least older ones, are less dangerous as propaganda vehicles than the commercial cinema since they require your conscious input; the subconscious elements in a film that reify ideology and norms aren’t rendered especially legible. You aren’t supposed to forget your social impotence through abstract identification with a figure of power the way Wilhelm Reich described the psychological appeal of fascism and, inadvertently, the appeal of cookie cutter Joseph Campbell style action/adventure narratives in the commercial cinema. Their consideration is necessary for any comprehensive exploration of TV as a vehicle or medium; the way they work creates incompatibilities and bugs with existing methods of criticism for more established media formats that will need to be patched in a later update.

They’re an enormous part of the culture that isn’t going away, and the longer theorists of pop culture ignore them in favor of a narrow focus on the things that more closely resemble prose literature in their construction, the further said critics will slip into niche irrelevance. The hardware gives a palimpsest history of the most important private space of the 21st century-the living room-and presents fantasy and escape in novel modes that will further illuminate just how those tendencies work.

Undone Season 1 Review

The team behind the scenes of the instant classic BoJack Horseman have introduced their new show, an experimental series using rotoscoped animation to explore the line between magic and mental illness. There’s a lot to like here, though much like BoJack, it takes a while to get going and fully reveal its direction.

First, a summing up of the plot: A woman named Alma is in a car accident and her father, a theoretical physicist who died in a mysterious car accident in 2002, starts appearing to her in visions. In these visions he tells her that she has been gifted with incredible powers to not only travel through time, but to change it. He claims that his car accident was a murder and tasks her with solving the murder and going back in time to stop it from occurring. Over the course of the series we are also introduced to her sister, her mother, and her boyfriend, all of whom become increasingly worried by her behavior, which resembles the symptoms of schizophrenia.

The first and most obviously challenging decision the show makes is to never resolve whether or not Alma is in fact mentally ill. This ambiguity isn’t a new thing for TV and movies and in fact it resembles the repeated themes of another ascendant TV auteur-Bryan Fuller, creator of Wonderfalls and Hannibal, two other shows where individuals are possessed by visions that problematize their sanity and lead them places they otherwise would never go. In particular Wonderfalls seems like a clear earlier reference point-a woman who works in a gift shop at Niagara Falls starts to see and hear inanimate objects talking to her and telling her to do things. They lead her on adventures and ultimately she does good things by listening to the objects despite the fact we’re never told whether she’s ill or clairvoyant.

Another obvious reference point is the Twin Peaks miniseries that aired a few years ago, particularly the final episode (spoilers ahead). In the finale, Agent Dale Cooper somehow goes back in time thinking he can save Laura Palmer from being killed by her father but finds himself in a timeline with no Laura Palmer; his attempts to redeem the past by changing it do nothing; the chronology of time as experienced by the mind is non-linear. What happens in the future changes the past, or at least the imagined history-after all, history is, as it has famously been said, a lie agreed upon. But a lie must contain inconsistencies-a lie wants to live its own truth, and wants to do so in the present, where history must exist because no other moment can exist as anything besides recollection or projection.

Common to all these shows and many others, probably because of the frightening ambiguity faced by the US right now, is ultimately exploring our own uneasy feelings of being unsure whether we’re at the edge of a cliff or the top of a mountain and our lack of ideas about what to do in either case. Alma’s trips into the past, slowly revealing details of her father’s own struggle with schizophrenia, don’t stabilize her or trigger catharsis. She just escalates the eccentricity of her behavior. The fact we open on the car accident also seems to suggest the two traumas-her car accident and the sudden loss of her father-are intertwined and that in fact she may be revisiting the familiar trauma of her parental loss to escape the unresolved trauma of her accident. At the same time, when she does travel into the past then references what she sees in the future, the accuracy of the details is confirmed.

The essence of trauma as a psychological phenomenon is the confusion of the mind between the desire to “become whole” again, i.e. to revert before the moment of destabilization, and to move forward and grow, your only actual path that involves motion. In some sense, the experience of trauma and the repetitive quality that marks it could be rephrased as “the inability to accept the necessity of the present.” And in some sense, this inability to accept the necessity of the present implies the desire for non-existence, given that in order for things to be, that which has been must have been. Dale Cooper makes Laura Palmer not exist paradoxically by saving her; in the season 1 finale of Undone, Alma sits waiting for her (possibly imagined) father to emerge from Mexican ruins. In Undone, Alma’s sister tells her her problems and Alma simply replies that once she brings their father back from the dead, those problems and most of the things that mark her day to day life in the present will be erased. Alma looks excited at the prospect. Her symbolic act to access the truth of the moment of their father’s death is tossing her body at a mirror, breaking it-destruction of the image as symbolic suicide. Like many in the US right now, she’s not sure exactly what she wants but she knows it isn’t this.

The animation is well done but pretty textbook rotoscoping; people who've seen either of Richard Linklater's two adventures into the form, Waking Life and A Scanner Darkly, will know what to expect here. At the same time, I think the dreaminess it adds to the proceedings justifies the decision and makes the show seem like a progression rather than strictly a reimagining of Fuller's preoccupations. I don't think Fuller is up to anything right now; maybe they should add him to the writers' room. He, Bob-Waksberg, and Kate Purdy could form a TV supergroup and travel through time together exploring the nature of trauma.

The acting is uniformly strong. Bob Odenkirk turns in strong work as the dead father, which…there are only so many ways you can say the guy's a genius. The guy's a genius. Rosa Salazar, whom I don't remember seeing in anything before this, does an exceptional job portraying Alma's slow transformation into either a shaman or a dangerously unstable individual, and a particularly exceptional job conveying the unease that comes when those closest to you betray your trust for fear you might hurt yourself. These situations are never portrayed as clear-cut; both sides are acting rationally given what they know.

The development of the plot makes it unclear where they could go from here. I look forward to the second season, but when I try to imagine what it could consist of once the season-ending cliffhanger is resolved, I draw a blank. But I guess that's why they're writing the show and I'm not.

Well worth checking out.

Ozymandias was problematic

I met a traveller from an antique land,
Who said—“Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.”

https://www.poetryfoundation.org/poems/46565/ozymandias

With the recent uproar over statues, I'm now beginning to wonder who Ozymandias was. Was he an Egyptian Pharaoh? Was he an Assyrian or a Persian? Maybe he was my distant ancestor Genghis Khan?

And was Percy Shelley really making a timeless statement about the way the passing of time destroys the memory of great men? Or was he just another white Englishman rejoicing in the “erasure” of ancient cultures not Greek or Roman?

I’m going to have to rate this poem as “problematic,” especially if Ozymandias was a person of color.

I guess this is the kind of person we should be naming buildings after

My elementary school wasn’t named after a President, a slave owner, or even a Union Army general. It was named after a Red Cross nurse who treated soldiers in France in World War I and came home to treat people during the pandemic of 1918.

Supposedly it was considered unusual back then to name a public school after someone who wasn’t a famous politician or war hero. But my hometown was ahead of its time. They named it after a “healthcare hero.” I wonder if she would have been in favor of Medicare for All.

Tik Tok Teens Have Managers?

In addition to being unbelievably sad, there’s something quite revealing about the story of Siya Kakkar, a 16-year-old New Delhi girl who had over a million followers on the Chinese streaming media service. She not only committed suicide. She had a manager.

In the social media post, Viral wrote: “Arjun Sarin who just spoke to her last night for a song collaboration, and he says she was in a good mood and perfectly alright. Even he has no clue what went wrong that she had to go this way.”

https://www.tribuneindia.com/news/delhi/16-year-old-tiktok-star-siya-kakkar-commits-suicide-in-delhi-104335

I have no idea whether Arjun Sarin is a big-time entertainment executive or just a family friend, but it's clear that Tik Tok, which has recently been declared the authentic voice of Generation Z, is also a marketing platform. Does a 16-year-old girl just become an overnight media sensation? Or is Tik Tok promoting children as viral stars in order to build a platform which can be used later for advertising, propaganda, or political disinformation? Who knows? The last time I was part of the demographic Tik Tok is targeting, Ronald Reagan was President. But it's always worth examining any company that just kind of appears out of nowhere and suddenly changes the social media landscape.

Twitter, for example, was originally marketed as a hangout for outsiders and edgy radicals. Just ask any liberal about "Black Twitter" and she'll probably tell you it's the second coming of the Civil Rights Movement. But it seems clear to me that Twitter has always been a way for corporate America to corral independent journalists and left-wing political activists into an online sheep pen where they can more easily be controlled, and where the discourse on the "left" can be brought into an intellectual framework that favors American imperialism. The number of "anarchists" on Twitter agitating for regime change in Syria and Iran is revealing. Get a little too far out of line, the way I did last year, and you will immediately be de-platformed. Also, notice how Twitter's "trending hashtags" blur the boundaries between advertising and organic discourse that comes from below. How many of them are genuine? How many have been paid for? And how exactly does Twitter make money, anyway?

In any event, RIP Siya Kakkar. When I was 16 years old I felt completely alone. I suppose you felt the exact same way. I guess you can have a million followers and still not have anybody who really understands you.

Every once in awhile I realize just how much we deserved 9/11


So we now have a genuine American war criminal as a spokesman for public safety, offering himself up as the symbol of a "real man" to assure our brainwashed, fundamentalist Christian, inbred redneck masses that wearing a mask during the Coronavirus pandemic doesn't make you a candidate for gay conversion therapy or a minion of George Soros. How charming. Well, I suppose that if Hitler had won the war, and the Wehrmacht had brought back some nasty bug from the frozen Russian steppe, we might have had Hermann Göring or Albert Speer, or maybe Baldur von Schirach doing a similar PSA. "Protect yourself, Germans. Real Aryans wear masks. Don't let the Jews, Slavs and Gypsies win. We didn't exterminate the Poles just so we could kill our grandfathers and grandmothers with the 'Warsaw Virus.'"

Are we really debating the French Revolution in 2020?

So first the loathsome neoconservative Senator Lindsey Graham hilariously compares Jamaal Bowman and Charles Booker to the leaders of the French Revolution. Then some editor at the New York Times, correctly, points out that the French Revolution is the only reason why we have democracy in Europe and North America. Then he immediately gets spammed by thousands of racist, right-wing assholes like Mike Cernovich, backs down, and winds up deleting his tweet.


Since I'm not an editor at the New York Times and can't be "cancelled" for being too left wing, let me come out and say it. The French Revolution was good. The Reign of Terror was good. But don't take my word for it; Mark Twain, who was good friends with the recently cancelled Ulysses Grant, put it best. Even though France eventually became a military dictatorship under Napoleon (who was also good), and even though the monarchy was eventually restored in 1814, the Reign of Terror eliminated centuries of institutional poverty put in place by feudalism and the Catholic Church.


But it gets better. Mike Cernovich, in addition to being a racist piece of shit and a rape apologist, is also a prominent advocate of antisemitic conspiracy theories like Pizzagate. How exactly do you get hired at the New York Times and then proceed to cower in fear of nasty little reactionaries like this? I don't know. Can you tell I'm angry? Fucking cuck liberals will never, ever stand up for themselves, even when they're absolutely correct about a historical event that, outside of the extreme right, isn't even controversial anymore. If we can't have a communist revolution, can we at the very least have liberals like Mark Twain and Ulysses Grant, liberals who would have laughed so hard at fascist little toads like Cernovich they would have literally pissed themselves? I suppose not.

And it gets even better. As Stephen Eric Bronner points out in his book A Rumor about the Jews: Antisemitism, Conspiracy, and the Protocols of Zion, "scientific racism" and antisemitic conspiracy theories, two evil things that absolutely refuse to die and currently live in the white supremacist buffoon in the White House, have their origins in the conservative reaction against the French Revolution. Arthur de Gobineau, for example, realized that since feudal hierarchies were dead, the European ruling class needed another myth, namely "scientific racism," to justify its power over the masses. So he came up with a crackpot theory that before 1789 the French aristocracy was made up of racially pure Nordics (Franks and Scandinavians) and that the common people were Celts or Latins.

And the Germans, displaying the blond hair of their ancestors, emerged to rule in every corner of the world. Neptune and his trident serve the Anglo-Saxon, their last descendant, and the peopled deserts of young America know the strength of this heroic people. But as to the Romans, Alemanni, Gauls, […] to put it briefly, those who are not German are created to serve.

https://en.wikipedia.org/wiki/Arthur_de_Gobineau

American Boomers love to joke about the French surrendering to the Germans. Well, here's where it started. What's more, antisemitic conspiracy theories like Pizzagate, and the idea that George Soros controls the American left, have similar origins in the royalist reaction against the French Revolution: in the idea that hundreds of years of feudal oppression had nothing to do with what happened in 1789, that it was all a conspiracy of the "Bavarian Illuminati."

A French Catholic priest called Augustin Barruel is generally regarded as one of history’s most famous conspiracy theorists. His multi-volume 1797 book, Memoirs Illustrating the History of Jacobinism, about an alleged conspiracy that led to the outbreak of the French Revolution, has been reprinted many times and translated into several languages.

Not long after the publication of his work, Barruel was sent a letter by a man called Jean Baptiste Simonini, who alleged that the Jews were also part of the conspiracy. This letter – the original of which has never been found – continues to shape antisemitic conspiracy thinking to this day.

https://theconversation.com/simoninis-letter-the-19th-century-text-that-influenced-antisemitic-conspiracy-theories-about-the-illuminati-134635

Yes, although it may have been penned by an agent of Czar Nicholas II, the Protocols of the Elders of Zion can be traced directly back to people who were pissed that Robespierre and Napoleon gave Jews equal citizenship. Oh ye ignorant Americans, for the 10,000th time: the French Revolution was good.