Barbie No, seriously! If anyone can make a good film about a doll franchise, it's probably Greta Gerwig. Not only was Little Women (2019) more than admirable, the same could definitely be said for Lady Bird (2017). More importantly, I can't help but feel she was the real 'Driver' behind Frances Ha (2012), one of the better modern takes on Claudia Weill's revelatory Girlfriends (1978). Still, whenever I remember that Barbie will be a film about a billion-dollar toy and media franchise with a nettlesome history, I recall that I rubbished the "Facebook film" that turned into The Social Network (2010). Anyway, the trailer for Barbie is worth watching, if only because it seems like a parody of itself.
Blitz It's difficult to overstate just how crucial the aerial bombing of London during World War II is to understanding the British psyche, despite it being a constructed phenomenon from the outset. Without wishing to underplay the deaths of over 40,000 civilians, Angus Calder pointed out in the 1990s that the modern mythology surrounding the event "did not evolve spontaneously; it was a propaganda construct directed as much at [then neutral] American opinion as at British." It will therefore be interesting to see how British-Grenadian-Trinidadian director Steve McQueen addresses a topic so essential to the British self-conception. (Remember the controversy in right-wing circles about the sole Indian soldier in Christopher Nolan's Dunkirk (2017)?) McQueen is perhaps best known for his 12 Years a Slave (2013), but he recently directed a six-part film anthology for the BBC which addressed the realities of post-Empire immigration to Britain, and this leads me to suspect he sees the Blitz and its surrounding mythology with a more critical perspective. But any attempt to complicate the story of World War II will be vigorously opposed in a way that will make the recent hullabaloo surrounding The Crown seem tame. All this is to say that the discourse surrounding this release may be as interesting as the film itself.
Dune, Part II Coming out of the cinema after the first part of Denis Villeneuve's adaptation of Dune (2021), I was struck by the impression that it was less a fresh adaptation of the 1965 novel by Frank Herbert than an attempt to rehabilitate David Lynch's 1984 version; in a broader sense, it was also an attempt to reestablish the primacy of cinema over streaming TV and the myriad other distractions in our lives. I must admit I'm not a huge fan of the original novel, finding within it a certain prurience regarding hereditary military regimes, written about with a sense of glee that belies a secret admiration for them... not to mention an eyebrow-raising allegory for the Middle East. Still, Dune, Part II is going to be a fantastic spectacle.
Ferrari It'll be curious to see how this differs substantially from the recent Ford v Ferrari (2019), but given that Michael Mann's Heat (1995) so effectively re-energised the gangster/heist genre, I'm more than willing to kick the tires of this biopic about the founder of the eponymous car manufacturer. I'm in the minority in preferring Mann's Thief (1981) over Heat, in part because the former deals in more abstract themes, so I'd perhaps have preferred to look forward to a more conceptual film from Mann than a story about one specific guy.
How Do You Live There are a few directors one can look forward to watching almost without qualification, and Hayao Miyazaki (My Neighbor Totoro, Kiki's Delivery Service, Princess Mononoke, Howl's Moving Castle, etc.) is one of them. And this is especially so given that The Wind Rises (2013) was meant to be the last collaboration between Miyazaki and Studio Ghibli. Let's hope he is able to come out of retirement in another ten years.
Indiana Jones and the Dial of Destiny Given I had a strong dislike of Indiana Jones and the Kingdom of the Crystal Skull (2008), I seriously doubt I will enjoy anything this film has to show me, but with 1981's Raiders of the Lost Ark remaining one of my most treasured films (read my brief homage), I still feel a strong sense of obligation towards the Indiana Jones name, despite it feeling like the copper is being pulled out of the walls of this franchise today.
Kafka I only know Polish filmmaker Agnieszka Holland through her Spoor (2017), an adaptation of Olga Tokarczuk's 2009 eco-crime novel Drive Your Plow Over the Bones of the Dead. I wasn't an unqualified fan of Spoor (nor the book on which it is based), but I am interested in Holland's take on the life of Czech author Franz Kafka, an author enmeshed with twentieth-century art and philosophy, especially that of central Europe. Holland has mentioned she intends to tell the story "as a kind of collage," and I can hope that it is an adventurous take on the over-furrowed biopic genre. Or perhaps Gregor Samsa will awake from uneasy dreams to find himself transformed in his bed into a huge verminous biopic.
The Killer It'll be interesting to see what path David Fincher is taking today, especially after his puzzling and strangely cold Mank (2020) portraying the writing process behind Orson Welles' Citizen Kane (1941). The Killer is said to be a straight-to-Netflix thriller based on the graphic novel about a hired assassin, which makes me think of Fincher's Zodiac (2007), and, of course, Se7en (1995). I'm not as entranced by Fincher as I used to be, but any film with Michael Fassbender and Tilda Swinton (with a score by Trent Reznor) is always going to get my attention.
Killers of the Flower Moon In Killers of the Flower Moon, Martin Scorsese directs an adaptation of a book about the FBI's investigation into a conspiracy to murder Osage tribe members in the early years of the twentieth century in order to deprive them of their oil-rich land. (The only thing more quintessentially American than apple pie is a conspiracy combined with a genocide.) Separate from learning more about this disquieting chapter of American history, I'd love to discover what attracted Scorsese to this particular story: he's one of the few top-level directors who have the ability to lucidly articulate their intentions and motivations.
Napoleon It often strikes me that, despite all of his achievements and fame, it's somehow still possible to claim that Ridley Scott is relatively underrated compared to other directors working at the top level today. Besides that, though, I'm especially interested in this film, not least of all because I just read Tolstoy's War and Peace (read my recent review) and am working my way through the mind-boggling 431-minute Soviet TV adaptation, but also because several auteur filmmakers (including Stanley Kubrick) have tried to make a Napoleon epic and failed.
Oppenheimer In a way, a biopic about the scientist responsible for the atomic bomb and the Manhattan Project seems almost perfect material for Christopher Nolan. He can certainly rely on stars to queue up to be in his movies (Robert Downey Jr., Matt Damon, Kenneth Branagh, etc.), but whilst I'm certain it will be entertaining on many fronts, I fear it will fall into the well-established Nolan mould of yet another single man struggling with obsession, deception and guilt who is trying in vain to balance order and chaos in the world.
The Way of the Wind Marked by philosophical and spiritual overtones, all of Terrence Malick's films are perfumed with themes of transcendence, nature and the inevitable conflict between instinct and reason. My particular favourite is his stunning Days of Heaven (1978), but The Thin Red Line (1998) and A Hidden Life (2019) also touched me in ways difficult to relate, and are among the few films about the Second World War that don't touch off my sensitivities about it (see my remarks about Blitz above). It is therefore somewhat Malickian that his next film will be a biblical drama about the life of Jesus. Given Malick's filmography, I suspect this will be far more subdued than William Wyler's 1959 Ben-Hur and significantly more equivocal in its conviction compared to Pier Paolo Pasolini's ardently progressive The Gospel According to St. Matthew (1964). However, little beyond that can be guessed, and the film may not even appear until 2024 or even 2025.
Zone of Interest I was mesmerised by Jonathan Glazer's Under the Skin (2013), and there is much to admire in his borderline 'revisionist gangster' film Sexy Beast (2000), so I will definitely be on the lookout for this one. The only thing making me hesitate is that Zone of Interest is based on a book by Martin Amis about a romance set inside the Auschwitz concentration camp. I haven't read the book, but Amis has something of a history in his grappling with the history of the twentieth century, and he seems to do it in a way that never sits right with me. But if Paul Verhoeven's Starship Troopers (1997) proves anything at all, it's all in the adaptation.
Hidden Valley Road (2020) Robert Kolker A compelling and disturbing account of the Galvin family, six of whom were diagnosed with schizophrenia, which details a journey through the study and misunderstanding of the condition. The story of the Galvin family offers a parallel history of the science of schizophrenia itself, from the era of institutionalisation, lobotomies and the 'schizo mother', to the contemporary search for genetic markers for the disease... all amidst fundamental disagreements about the nature of schizophrenia and, indeed, of all illnesses of the mind. Samples of the Galvins' DNA informed decades of research which, curiously, continues to this day, potentially offering paths to treatment, prediction and even eradication of the disease, although on this last point I fancy that I detect a kind of neo-Victorian hubris that we alone will be the ones to find a cure. Either way, a gentle yet ultimately tragic view of a curiously 'American' family, where the inherent lack of narrative satisfaction brings a frustration and sadness of its own.
Islands of Abandonment: Life in the Post-Human Landscape (2021) Cal Flyn In this disarmingly lyrical book, Cal Flyn addresses the twin questions of what happens after humans are gone and how far our damage to nature can be undone. From the forbidden areas of post-war France to the mining regions of Scotland, Islands of Abandonment explores the extraordinary places where humans no longer live in an attempt to give us a glimpse into what happens when mankind's impact on nature is, for one reason or another, forced to stop. Needless to say, if anxieties in this area are not curdling away in your subconscious mind, you are probably in some kind of denial. Through a journey into desolate, eerie and ravaged areas of the world, this artfully-written study offers profound insights into human nature, eschewing the usual dry sawdust of Wikipedia trivia. Indeed, I summed it up to a close friend by remarking that, through some kind of hilarious administrative error, the book's publisher accidentally dispatched a poet instead of a scientist to write this book. With glimmers of hope within the (mostly) tragic travelogue, Islands of Abandonment is not only a compelling read, but also a fascinating insight into the relationship between Nature and Man.
The Anatomy of Fascism (2004) Robert O. Paxton Everyone is absolutely sure they know what fascism is... or at least they feel confident choosing from a buffet of features to suit the political mood. To be sure, this is not a new phenomenon: even as 'early' as 1946, George Orwell complained in Politics and the English Language that "the word Fascism has now no meaning except in so far as it signifies 'something not desirable'". Still, it has proved uncommonly hard to define the core nature of fascism and what differentiates it from related political movements. This is still of great significance in the twenty-first century, for the definition ultimately determines where the powerful label of 'fascist' can be applied today. Part of the enjoyment of reading this book was having my own cosy definition thoroughly dismantled and replaced with a robust system of abstractions and common themes. This is achieved through a study of the intellectual origins of fascism and how it played out in the streets of Berlin, Rome and Paris. Moreover, unlike Strongmen (see above), fascisms that failed to gain meaningful power are analysed too, including Oswald Mosley's British Union of Fascists. Curiously enough, Paxton's own definition of fascism is left to the final chapter, and by the time you reach it, you get an anti-climactic feeling of it being redundant. Indeed, whatever it actually is, fascism is really not quite like any other 'ism' at all, so to try and classify it as one might be a mistake. In his introduction, Paxton warns that many of those infamous images associated with fascism (e.g. Hitler in Triumph of the Will, Mussolini speaking from a balcony, etc.) have the ability to induce facile errors about the fascist leader and the apparent compliance of the crowd. (Contemporary accounts often record how sceptical the common man was of the leader's political message, even if they were transfixed by their oratorical bombast.)
As it happens, I believe I had something of an advantage in reading this via an audiobook, as I completely avoided re-absorbing these iconic images. To me, this was an implicit reminder that, however you choose to reduce it to a definition, fascism is undoubtedly the most visual of all political forms, presenting itself to us in vivid and iconic primary images: ranks of disciplined marching youths; coloured-shirted militants beating up members of demonised minorities; the post-war pictures from the concentration camps... Still, regardless of how you choose to read it, The Anatomy of Fascism is a powerful book that can teach a great deal about fascism in particular and history in general.
What Good are the Arts? (2005) John Carey What Good are the Arts? takes a delightfully sceptical look at the nature of art, and cuts through the sanctimony and cant that inevitably surrounds it. It begins by revealing the flaws in lofty aesthetic theories and, along the way, debunks the claims that art makes us better people. The fine arts may certainly bring joy into your life, but by no means do they make you automatically virtuous. Carey also rejects the entire enterprise of separating things into things that are art and things that are not, making a thoroughly convincing case that there is no transcendental category containing so-called 'true' works of art. But what is perhaps equally important to what Carey is claiming is the way he does all this. As in, this is an extremely enjoyable book to read, with not only a fine sense of pace and language, but a devilish sense of humour as well. To be clear, What Good are the Arts? is no crotchety monograph: Leo Tolstoy's What Is Art? (1897) is hilarious to read in similar ways, but you can't avoid feeling its cantankerous tone holds Tolstoy's argument back. By contrast, Carey makes his argument in a playful sort of manner, in a way that made me slightly sad to read other polemics throughout the year. It's definitely not that modern genre of boomer jeremiad about the young, political correctness or, heaven forbid, 'cancel culture'... which, incidentally, made Carey's 2014 memoir, The Unexpected Professor, something of a disappointing follow-up. Just for fun, Carey later undermines his own argument by arguing at length for the value of one art in particular. Literature, Carey asserts, is the only art capable of reasoning and the only art with the ability to criticise. Perhaps so, and Carey spends a chapter or so contending that fiction has the exclusive power to inspire the mind and move the heart towards practical ends... or at least far better than any work of conceptual art.
Whilst reading this book I found myself taking down innumerable quotations and laughing at the jokes far more than I disagreed. And the sustained and intellectual style of polemic makes this a pretty strong candidate for my favourite overall book of the year.
Publisher: Alfred A. Knopf
Copyright: 2021
ISBN: 0-593-32010-7
Format: Kindle
Pages: 260
You were, quite literally, doing your job from home. But you weren't working from home. You were laboring in confinement and under duress. Others have described it as living at work. You were frantically tapping out an email while trying to make lunch and supervise distance learning. You were stuck alone in a cramped apartment for weeks, unable to see friends or family, exhausted, and managing a level of stress you didn't know was possible. Work became life, and life became work. You weren't thriving. You were surviving.

The stated goal of this book is to reclaim the concept of working from home, not only from the pandemic, but also from the boundary-destroying metastasis of work into non-work life. It does work towards that goal, but the description of what would be required for working from home to live up to its promise becomes a sweeping critique of the organization and conception of work, leaving it nearly as applicable to those who continue working from an office. Turns out that the main problem with working from home is the work part, not the "from home" part. This was a fascinating book to read in conjunction with A World Without Email. Warzel and Petersen do the structural and political analysis that I sometimes wish Newport would do more of, but as a result offer less concrete advice. Both, however, have similar diagnoses of the core problems of the sort of modern office work that could be done from home: it's poorly organized, poorly managed, and desperately inefficient. Rather than attempting to fix those problems, which is difficult, structural, and requires thought and institutional cooperation, we're compensating by working more. This both doesn't work and isn't sustainable. Newport has a background in productivity books and a love of systems and protocols, so his focus in A World Without Email is on building better systems of communication and organization of work.
Warzel and Petersen come from a background of reporting and cultural critique, so they put more focus on power imbalances and power-serving myths about the American dream. Where Newport sees an easy-to-deploy ad hoc work style that isn't fit for purpose, Warzel and Petersen are more willing to point out intentional exploitation of workers in the guise of flexibility. But they arrive at some similar conclusions. The way office work is organized is not leading to more productivity. Tools like Slack encourage the public performance of apparent productivity at the cost of the attention and focus required to do meaningful work. And the process is making us miserable. Out of Office is, in part, a discussion of what would be required to do better work with less stress, but it also shares a goal with Newport and some (but not most) corners of productivity writing: spend less time and energy on work. The goal of Out of Office is not to get more work done. It's to work more efficiently and sustainably and thus work less. To reclaim the promise of flexibility so that it benefits the employee and not the employer. To recognize, in the authors' words, that the office can be a bully, locking people in to commute schedules and unnatural work patterns, although it also provides valuable moments of spontaneous human connection. Out of Office tries to envision a style of work that includes the office sometimes, home sometimes, time during the day to attend to personal chores or simply to take a mental break from an unnatural eight hours (or more) of continuous focus, universal design, real worker-centric flexibility, and an end to the constant productivity ratchet where faster work simply means more work for the same pay. That's a lot of topics for a short book, and structurally this is a grab bag. Some sections will land and some won't. Loom's video messages sound like a nightmare to me, and I rolled my eyes heavily at the VR boosterism, reluctant as it may be. 
The section on DEI (diversity, equity, and inclusion) was a valiant effort that at least gestures towards the dismal track record of most such efforts, but still left me unconvinced that anyone knows how to improve diversity in an existing organization without far more brute-force approaches than anyone with power is usually willing to consider. But there's enough here, and the authors move through topics quickly enough, that a section that isn't working for you will soon be over. And some of the sections that do work are great. For example, the whole discussion of management.
Many of these companies view middle management as bloat, waste, what David Graeber would call a "bullshit job." But that's because bad management is a waste; you're paying someone more money to essentially annoy everyone around them. And the more people experience that sort of bad management, and think of it as "just the way it is," the less they're going to value management in general.

I admit to a lot of confirmation bias here, since I've been ranting about this for years, but management must be the most widespread professional job for which we ignore both training and capability and assume that anyone who can do any type of useful work can also manage people doing that work. It's simply not true, it creates workplaces full of horrible management, and that in turn creates a deep and unhelpful cynicism about all management. There is still a tendency on the left to frame this problem in terms of class struggle, on the reasonable grounds that for decades under "scientific management" of manufacturing that's what it was. Managers were there to overwork workers and extract more profits for the owners, and labor unions were there to fight back against managers. But while some of this does happen in the sort of office work this book is focused on, I think Warzel and Petersen correctly point to a different cause.
"The reason she was underpaid on the team was not because her boss was cackling in the corner. It was because nobody told the boss it was their responsibility to look at the fucking spreadsheet."

We don't train managers, we have no clear expectations for what managers should do, we don't meaningfully measure their performance, we accept a high-overhead and high-chaos workstyle based on ad hoc one-to-one communication that de-emphasizes management, and many managers have never seen good management and therefore have no idea what they're supposed to be doing. The management problem for many office workers is less malicious management than incompetent management, or simply no effective management at all apart from an occasional reorg and a complicated and mind-numbing annual review form. The last section of this book (apart from concluding letters to bosses and workers) is on community, and more specifically on extracting time and energy from work (via the roadmap in previous chapters) and instead investing it in the people around you. Much ink has been spilled about the collapse of American civic life, about how we went from a nation of joiners to a nation of isolated individual workers with weak and failing community institutions. Warzel and Petersen correctly lay some blame for this at the foot of work, and see the reorganization of work and an increase in work from home (and thus a decrease in commutes) as an opportunity to reverse that trend. David Brooks recently filled in for Ezra Klein on his podcast and talked with University of Chicago professor Leon Kass, which I listened to shortly after reading this book. In one segment, they talked about marriage and complained about the decline in marriage rates. They were looking for causes in people's moral upbringing, in their life priorities, in the lack of aspiration for permanence in kids these days, and in any other personal or moral failing that would allow them to be smugly judgmental.
It was a truly remarkable thing to witness. Neither man at any point in the conversation mentioned either money or time. Back in the world most Americans live in, real wages have been stagnant for decades, student loan debt is skyrocketing as people desperately try to keep up with the ever-shifting requirements for a halfway-decent job, and work has expanded to fill all hours of the day, even for people who don't have to work multiple jobs to make ends meet. Employers have fully embraced a "flexible" workforce via layoffs, micro-optimizing work scheduling, eliminating benefits, relying on contract and gig labor, and embracing exceptional levels of employee turnover. The American worker has far less money, time, and stability, three important foundations for marriage and family as well as participation in most other civic institutions. People like Brooks and Kass stubbornly cling to their feelings of moral superiority instead of seeing a resource crisis. Work has stolen the resources that people previously put into those other areas of their life. And it's not even using those resources effectively. That's, in a way, a restatement of the topic of this book. Our current way of organizing work is not sustainable, healthy, or wise. Working from home may be part of a strategy for changing it. The pandemic has already heavily disrupted work, and some of those changes, including increased working from home, seem likely to stick. That provides a narrow opportunity to renegotiate our arrangement with work and try to make those changes stick. I largely agree with the analysis, but I'm pessimistic. I think the authors are as well. We're very bad at social change, and there will be immense pressure for everything to go "back to normal." Those in the best bargaining position to renegotiate work for themselves are not in the habit of sharing that renegotiation with anyone else.
But I'm somewhat heartened by how much public discussion there currently is about a more fundamental renegotiation of the rules of office work. I'm also reminded of a deceptively profound aphorism from economist Herbert Stein: "If something cannot go on forever, it will stop." This book is a bit uneven and is more of a collection of related thoughts than a cohesive argument, but if you are hungry for more worker-centric analyses of the dynamics of office work (inside or outside the office), I think it's worth reading. Rating: 7 out of 10
The disease stiffened and carried off three or four patients who were expected to recover. These were the unfortunates of the plague, those whom it killed when hope was high.

It somehow captured the nostalgic yearning for high-definition videos of cities and public transport; one character even visits the completely deserted railway station in Oman simply to read the timetables on the wall.
Small, podgy, and at best middle-aged, Smiley was by appearance one of London's meek who do not inherit the earth. His legs were short, his gait anything but agile, his dress costly, ill-fitting, and extremely wet.

Almost a direct rebuttal to Ian Fleming's 007, Tinker, Tailor has broken-down cars, bad clothes, women with their own internal and external lives (!), pathetically primitive gadgets, and (contra Mad Men) hangovers that last significantly longer than ten minutes. In fact, the main aspect that the mostly excellent 2011 film adaptation doesn't really capture is the smoggy and run-down nature of 1970s London: this is not your proto-Cool Britannia of Austin Powers or GTA: London 1969; the city is truly 'gritty' in the sense that there is a thin film of dirt and grime on every surface imaginable. Another angle that the film cannot capture well is just how purposefully the novel does not mention the United States. Despite the US obviously being the dominant power, the British vacillate between pretending it doesn't exist and implying its irrelevance to the matter at hand. This is no mistake on Le Carré's part, as careful readers are rewarded by finding this denial of US hegemony in metaphor throughout: pace Ian Fleming, there is no obvious Felix Leiter to loudly throw money at the problem or a Sheriff Pepper to serve as cartoon racist for the Brits to feel superior about. By contrast, I recall that a clever allusion to "dusty teabags" is subtly mirrored a few paragraphs later with a reference to the installation of a coffee machine in the office, likely symbolic of the omnipresent and unavoidable influence of America. (The officer class convince themselves that coffee is a European import.) Indeed, Le Carré communicates a feeling of being surrounded on all sides by the peeling wallpaper of Empire. Oftentimes, the writing style matches the gracelessness and inelegance of the world it depicts.
The sentences are dense and you find your brain performing a fair amount of mid-flight sentence reconstruction, reparsing clauses, commas and conjunctions to interpret Le Carré's intended meaning. In fact, in his eulogy-cum-analysis of Le Carré's writing style, William Boyd, himself a ventriloquist of Ian Fleming, named this intentional technique 'staccato'. Like the musical term, I suspect the effect of this literary staccato is as much about the impact it makes on a sentence as the imperceptible space it generates after it. Lastly, the large cast in this sprawling novel is completely believable, all the way from the Russian spymaster Karla to the minor schoolboy Roach, the latter possibly a stand-in for Le Carré himself. I got through the 500-odd pages in just a few days, somehow managing to hold the almost-absurdly complicated plot in my head. This is one of those classic books of the genre that made me wonder why I had not got around to it before.
Perhaps his life might have veered elsewhere if the US government had opened the country to colored advancement like they opened the army. But it was one thing to allow someone to kill for you and another to let him live next door.

Sardonic aperçus of this kind are pretty relentless throughout the book, but it never tips its hand too far into nihilism, especially when some of the visual metaphors are often first-rate: "An American flag sighed on a pole" is one I can easily recall from memory. In general though, The Nickel Boys is not only more world-weary in tenor than his previous novel, the United States it describes seems almost too beaten down to have the energy to conjure up the Swiftian magical realism that prevented The Underground Railroad from being overly lachrymose. Indeed, even when Whitehead transports us to present-day New York City, we can't indulge in another kind of fantasy, the one where America has solved its problems:
The Daily News review described the [Manhattan restaurant] as nouveau Southern, "down-home plates with a twist." What was the twist? That it was soul food made by white people?

It might be overly reductionist to connect Whitehead's tonal downshift with the racial justice movements of the past few years, but whatever the reason, we've ended up with a hard-hitting, crushing and frankly excellent book.
"Earlier tonight I gave some thought to stealing a kiss from you, though you are very young, and sick and unattractive to boot, but now I am of a mind to give you five or six good licks with my belt." "One would be as unpleasant as the other."

Perhaps this should be unsurprising. Mattie, a fourteen-year-old girl from Yell County, Arkansas, can barely fire her father's heavy pistol, so she has only words to wield as her weapon. Anyway, it's not just me who treasures this book. In her encomium that prefaces most modern editions, Donna Tartt of The Secret History fame traces the novel's origins through Huckleberry Finn, praising its elegance and economy: "The plot of True Grit is uncomplicated and as pure in its way as one of the Canterbury Tales". I haven't read any Chaucer, but I am inclined to agree. Tartt also recalls that True Grit vanished almost entirely from the public eye after the release of John Wayne's flimsy cinematic vehicle in 1969; this earlier film was, Tartt believes, "good enough, but doesn't do the book justice". As it happens, reading a book with its big-screen adaptation as a chaser has been a minor theme of my 2020, including P. D. James' The Children of Men, Kazuo Ishiguro's Never Let Me Go, Patricia Highsmith's Strangers on a Train, James Ellroy's The Black Dahlia, John Green's The Fault in Our Stars, John le Carré's Tinker, Tailor, Soldier, Spy and even a staged production of Charles Dickens' A Christmas Carol streamed from The Old Vic. For an autodidact with no academic background in literature or cinema, I've been finding this an effective and enjoyable means of getting closer to these fine books and films: it is precisely where they deviate (or perhaps where they are deficient) that offers a means by which one can see how they were constructed.
I've also found that adaptations can tell you a lot about the culture in which they were made: take the 'straightwashing' in the film version of Strangers on a Train (1951) compared to the original novel, for example. It is certainly true that adaptations rarely (as Tartt put it) "do the book justice", but she might also be right to alight on a legal metaphor, for as the saying goes, to judge a movie in comparison to the book is to do both a disservice.
We're accustomed to worrying about AI systems being built that will either "go rogue" and attack us, or succeed us in a bizarre evolution of, um, evolution; what we didn't reckon on is the sheer inscrutability of these manufactured minds. And "minds" is not a misnomer. How else should we think about the neural network Google has built so its translator can model the interrelation of all words in all languages, in a kind of three-dimensional "semantic space"?

New Dark Age also turns its attention to the weird, algorithmically derived products offered for sale on Amazon as well as the disturbing and abusive videos that are automatically uploaded by bots to YouTube. It should, by rights, be a mess of disparate ideas and concerns, but Bridle has a flair for introducing topics which reveals he comes to computer science from another discipline altogether; indeed, in a four-part series he made for Radio 4, he's primarily referred to as "an artist". Whilst New Dark Age has rather abstract section topics, Adam Greenfield's Radical Technologies is a rather different book altogether. Each chapter dissects one of the so-called 'radical' technologies that condition the choices available to us, asking how they work, what challenges they present to us and who ultimately benefits from their adoption. Greenfield takes his scalpel to smartphones, machine learning, cryptocurrencies, artificial intelligence, etc., and I don't think it would be unfair to say that he starts and ends with a cynical point of view. He is no reactionary Luddite, though, and his writing is both informed and extremely well explained; it also lacks the lazy, affected and Private Eye-like cynicism of, say, Attack of the 50 Foot Blockchain. The books aren't a natural pair, for Bridle's writing contains quite a bit of air in places, ironically mimicking the very 'clouds' he inveighs against. Greenfield's book, by contrast, has little air and a much lower pH value.
Still, it was more than refreshing to read two technology books that do not limit themselves to platitudinal booleans, be those dangerously naive (e.g. Kevin Kelly's The Inevitable) or relentlessly nihilistic (Shoshana Zuboff's The Age of Surveillance Capitalism). Sure, they are both anti-technology screeds, but they tend to make arguments about systems of power rather than specific companies, and they avoid seeing 'Big Tech' through a narrower, Silicon Valley-obsessed lens; for that (dipping into some other 2020 reading of mine) I might suggest Wendy Liu's Abolish Silicon Valley or Scott Galloway's The Four. Still, both books are superlatively written. In fact, Adam Greenfield has some of the best non-fiction writing around, both in terms of how he can explain complicated concepts (particularly the smart contract mechanism of the Ethereum cryptocurrency) as well as in his extremely finely crafted sentences; I often felt that the writing style almost had no need to be that poetic, and I particularly enjoyed his fictional scenarios at the end of the book.
A better proxy for your life isn't your first home, but your last. Where you draw your last breath is more meaningful, as it's a reflection of your success and, more important, the number of people who care about your well-being. Your first house signals the meaningful: your future and possibility. Your last home signals the profound: the people who love you. Where you die, and who is around you at the end, is a strong signal of your success or failure in life.

Nir Eyal's Indistractable, however, is a totally different kind of 'self-help' book. The important background story is that Eyal was the author of the widely read Hooked, which turned into a secular Bible of so-called 'addictive design'. (If you've ever been cornered by a techbro wielding a Wikipedia-thin knowledge of B. F. Skinner's behaviourist psychology and how it can get you to click 'Like' more often, it ultimately came from Hooked.) However, Eyal's latest effort is actually an extended mea culpa for his previous sin, and he offers both high- and low-level palliative advice on how to avoid falling for the tricks he so studiously espoused before. I suppose we should be thankful to capitalism for selling both cause and cure. Speaking of markets, there appears to be a growing appetite for books in this 'anti-distraction' category, and whilst I cannot claim to have done an exhaustive study of this nascent field, Indistractable argues its points well without relying on accurate-but-dry "studies show..." citations or, worse, Gladwellian gotchas. My main criticism, however, would be that Eyal doesn't acknowledge the limits of a self-help approach to this problem; it seems that many of the issues he outlines are an inescapable part of the alienation of modern Western society, and the only way one can really avoid distraction is to move up the income ladder or move out to a 500-acre ranch.
Why is my e-reader still getting regular OS updates, while Google stopped issuing security patches for my smartphone four years ago?

To try to answer this, let us turn to economic incentives theory. Although not the be-all and end-all some think it is, incentives theory is not a bad tool to analyse this particular problem. Executives at Google most likely followed a very business-centric logic when they decided to drop support for the Nexus 5. Likewise, Rakuten Kobo's decision to continue updating older devices certainly had very little to do with ethics or loyalty to their user base. So, what are the incentives that keep Kobo updating devices, and why are they different from smartphone manufacturers'?

A portrait of the current long-term software support offerings for smartphones and e-readers

Before delving deeper into economic theory, let's talk data. I'll be focusing on two brands of e-readers, Amazon's Kindle and Rakuten's Kobo. Although the e-reader market is highly segmented and differs a lot based on geography, Amazon was in 2015 the clear worldwide leader with 53% of worldwide e-reader sales, followed by Rakuten Kobo at 13%. On the smartphone side, I'll be differentiating between Apple's iPhones and Android devices, taking Google as the barometer for that ecosystem. As mentioned below, Google is sadly the leader in long-term Android software support.

Rakuten Kobo

According to their website and to this Wikipedia table, the only e-readers Kobo has deprecated are the original Kobo eReader and the Kobo WiFi N289, both released in 2010. This makes their oldest still-supported device the Kobo Touch, released in 2011. In my book, that's a pretty good track record. Long-term software support does not seem to be advertised or to be a clear selling point in their marketing.

Amazon

According to their website, Amazon has dropped support for all 8 devices produced before the Kindle Paperwhite 2nd generation, first sold in 2013.
To put things in perspective, the first Kindle came out in 2007, 3 years before Kobo started selling devices. Like Rakuten Kobo, Amazon does not make promises of long-term software support as part of their marketing.

Apple

Apple has a very clear software support policy for all their devices:
Owners of iPhone, iPad, iPod or Mac products may obtain service and parts from Apple or Apple service providers for five years after the product is no longer sold, or longer where required by law.

This means in the worst-case scenario of buying an iPhone model just as it is discontinued, one would get a minimum of 5 years of software support.

Android

Google's policy for their Android devices is to provide software support for 3 years after the launch date. If you buy a Pixel device just before the new one launches, you could theoretically get only 2 years of support. In 2018, Google decided OEMs would have to provide security updates for at least 2 years after launch, threatening not to license Google Apps and the Play Store if they didn't comply.

A question of cost structure

From the previous section, we can conclude that in general, e-readers seem to be supported longer than smartphones, and that Apple does a better job than Android OEMs, providing support for about twice as long. Even Fairphone, whose entire business is to build phones designed to last and to be repaired, was not able to keep the Fairphone 1 (2013) updated for more than a couple of years and seems to be struggling to keep the Fairphone 2 (2015) running an up-to-date version of Android.

Anyone who has ever worked in IT will tell you: maintaining software over time is hard work, and hard work by specialised workers is expensive. Most commercial electronic devices are sold and developed by for-profit enterprises, and software support all comes down to a question of cost structure. If companies like Google or Fairphone are expected to provide long-term support for the devices they manufacture, they have to be able to fund their work somehow.

In a perfect world, people would be paying for the cost of said long-term support, as it would likely be cheaper than buying new devices every few years and would certainly be better for the planet. Problem is, manufacturers aren't making them pay for it.
Economists call this type of problem an externality: something that should be part of the cost of a good, but isn't, for one reason or another. A classic example of an externality is pollution. Clearly pollution is bad and leads to horrendous consequences, like climate change. Sane people agree we should drastically cut our greenhouse gas emissions, and yet, we aren't. Neo-classical economic theory argues the way to fix externalities like pollution is to internalise these costs, in other words, to make people pay for the "real price" of the goods they buy. In the case of climate change and pollution, neo-classical economic theory is plain wrong (spoiler alert: it often is), but this is where band-aids like the carbon tax come from. Still, coming back to long-term software support, let's see what would happen if we were to try to internalise software maintenance costs. We can do this in multiple ways.

1 - Include the price of software maintenance in the cost of the device

This is the choice Fairphone makes. This might somewhat work out for them since they are a very small company, but it cannot scale for the following reasons:
Welcome to the July 2020 report from the Reproducible Builds project. In these monthly reports, we round up the things that we have been up to over the past month. As a brief refresher, the motivation behind the Reproducible Builds effort is to ensure no flaws have been introduced from the original free software source code to the pre-compiled binaries we install on our systems. (If you're interested in contributing to the project, please visit our main website.)
ftp.debian.org … were made from their claimed sources.
Tavis Ormandy published a blog post making the provocative claim that "You don't need reproducible builds", asserting elsewhere that the many attacks that have been extensively reported in our previous reports are "fantasy threat models". A number of rebuttals have been made, including one from long-time Reproducible Builds contributor Bernhard Wiedemann.
On our mailing list this month, Debian Developer Graham Inggs posted asking for ideas as to why the openorienteering-mapper Debian package was failing to build on the Reproducible Builds testing framework. Chris Lamb remarked from the build logs that the package may be missing a build dependency, although Graham then used our own diffoscope tool to show that the resulting package remains unchanged with or without it. Later, Nico Tyni noticed that the build failure may be due to the relationship between the __FILE__ C preprocessor macro and the -ffile-prefix-map GCC flag.
An issue in Zephyr, a small-footprint kernel designed for use on resource-constrained systems, around .a library files not being reproducible was closed after it was noticed that a key part of their toolchain had been updated to call --enable-deterministic-archives by default.
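For background, the underlying mechanism is binutils' deterministic mode for ar(1): the D modifier (the behaviour that --enable-deterministic-archives switches on by default) zeroes out member timestamps, owners and modes in the archive headers. A small illustration, assuming GNU binutils is installed (the file names are hypothetical):

```shell
# Archive the same member twice with different mtimes. With the D
# (deterministic) modifier the archive headers no longer record the
# member's timestamp, so both .a files come out byte-identical.
mkdir -p /tmp/ar-demo && cd /tmp/ar-demo
echo 'int x;' > member.txt
touch -d '2001-01-01 00:00:00 UTC' member.txt
ar Drc one.a member.txt
touch -d '2020-01-01 00:00:00 UTC' member.txt
ar Drc two.a member.txt
cmp one.a two.a && echo "reproducible archive"
```

Without the D modifier (or with U, which forces the old behaviour), the two archives would differ in their embedded timestamps.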
Reproducible Builds developer kpcyrd commented on a pull request against the libsodium cryptographic library wrapper for Rust, arguing against the testing of CPU features at compile-time. He noted that:

I've accidentally shipped broken updates to users in the past because the build system was feature-tested and the final binary assumed the instructions would be present without further runtime checks

David Kleuker also asked a question on our mailing list about using SOURCE_DATE_EPOCH with the install(1) tool from GNU coreutils. When comparing two installed packages he noticed that the filesystem birth times differed between them. Chris Lamb replied, realising that this was actually a consequence of using an outdated version of diffoscope and that a fix was in diffoscope version 146, released in May 2020.
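For context, SOURCE_DATE_EPOCH is the project's convention for pinning "the current time" to a fixed value, typically taken from the latest changelog entry. A typical (purely illustrative) use with GNU tar, which can clamp archive metadata to it, might look like this:

```shell
# Build the same tarball twice; --mtime clamps member timestamps to
# SOURCE_DATE_EPOCH, while --sort/--owner/--group/--numeric-owner remove
# the other usual sources of nondeterminism, so both runs are identical.
export SOURCE_DATE_EPOCH=1577836800   # e.g. 2020-01-01, from a changelog
mkdir -p /tmp/sde-demo/pkg
echo 'hello' > /tmp/sde-demo/pkg/file
cd /tmp/sde-demo
tar --sort=name --owner=0 --group=0 --numeric-owner \
    --mtime="@${SOURCE_DATE_EPOCH}" -cf one.tar pkg
touch pkg/file    # perturb the on-disk timestamp between builds
tar --sort=name --owner=0 --group=0 --numeric-owner \
    --mtime="@${SOURCE_DATE_EPOCH}" -cf two.tar pkg
cmp one.tar two.tar && echo "reproducible tarball"
```

The same variable is honoured by many other tools (gcc, debhelper, Sphinx, etc.), which is what makes it a useful single knob for whole-distribution reproducibility.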
Later in July, John Scott posted asking for clarification regarding the JavaScript files on our website, in order to add metadata for LibreJS, the browser extension that blocks non-free JavaScript from executing. Chris Lamb investigated the issue and realised that we could drop a number of unused JavaScript files, and added unminified versions of Bootstrap and jQuery.
Other changes to our website included updates to the README file, marking the Alpine Linux continuous integration tests as currently disabled and linking the Arch Linux Reproducible Status page from our projects page.
diffoscope is our in-depth and content-aware diff utility. This month, versions 150, 151, 152, 153 & 154 were released, including the following changes:

- Don't require zipnote(1) to determine differences in a .zip file as we can use libarchive.
- Allow --profile as a synonym for --profile=-, ie. write profiling data to standard output.
- Increase the minimum length of strings(1) output to eight characters to avoid unnecessary diff noise.
- The --exclude-directory-metadata and --no-exclude-directory-metadata options have been replaced with --exclude-directory-metadata={yes,no}.
- Emulate xxd(1) and show bytes in groups of 4.
- Avoid warning with javap not found in path if it is available in the path but it did not result in an actual difference.
- Fix the ... not available in path messages when looking for Java decompilers, which used the Python class name instead of the command.
- Reduce --debug log noise by truncating the has_some_content messages.
- Clarify the compare_files log message when the file does not have a literal name.
- Optimise exit_if_paths_do_not_exist to not check files multiple times.
- Add an add_comment helper method; don't mess with our internal list directly.
- Replace str.format with Python f-strings and make it easier to navigate to the main.py entry point.
- Return None in the failure case as we return a non-None value in the success one.
- Add a NullChanges quasi-file to represent missing data in the Debian package comparator and clarify the use of a null diff in order to remember an exit code.
- Allow passing command-line arguments from a file via diffoscope @args.txt. (!62)
- Update tests relating to objdump and remove raw instructions from ELF tests.
- Emit a --verbose-level warning when the Archive::Cpio Perl module is missing. (!6)
reprotest is our end-user tool to build the same source code twice in widely differing environments and then check the binaries produced by each build for any differences. This month, Vagrant Cascadian made a number of changes to support diffoscope version 153, which had removed the (deprecated) --exclude-directory-metadata and --no-exclude-directory-metadata command-line arguments, and updated the testing configuration to also test under Python version 3.8.
An issue in the debhelper build tool has been impacting the reproducibility status of hundreds of packages that use the CMake build system. This month, however, Niels Thykier uploaded debhelper version 13.2, which passes the -DCMAKE_SKIP_RPATH=ON and -DBUILD_RPATH_USE_ORIGIN=ON arguments to CMake when using the (currently experimental) debhelper compatibility level 14.
According to Niels, this change:
should fix some reproducibility issues, but may cause breakage if packages run binaries directly from the build directory.

34 reviews of Debian packages were added, 14 were updated and 20 were removed this month, adding to our knowledge about identified issues. Chris Lamb added and categorised the nondeterministic_order_of_debhelper_snippets_added_by_dh_fortran_mod and gem2deb_install_mkmf_log toolchain issues.
Lastly, Holger Levsen filed two more wishlist bugs against the debrebuild Debian package rebuilder tool.
A number of patches addressing reproducibility issues were also written this month, including:

- afl (fix an incorrectly built manual page that varied based on kernel boot options)
- brp-check-suse (sorting issue)
- dnscrypt-proxy (sort the output of find(1))
- graphviz (timezone issue, forwarded from Debian)
- guile-gcrypt (parallelism)
- insighttoolkit (prevent CPU detection, forwarded upstream)
- ipopt (parallelism issue and use https://tracker.debian.org/pkg/strip-nondeterminism)
- jboss-logging-tools (date, forwarded upstream)
- kismet (date)
- lcov (date issue, already upstream)
- multus (date issue, already upstream)
- multus (date)
- paperjam (date issue, forwarded upstream)
- pspp (scrub testsuite.log)
- python-PyNaCl (sort Python glob/readdir)
- python-enaml (workaround an open upstream Python issue)
- sac (omit creation time from .zip files)
- sql-parser (sort, already upstream)
- ugrep (CPU-related issue, already upstream)
- ugrep (CPU-related issue)
- unknown-horizons (filesystem ordering issue, already upstream)
- unknown-horizons (filesystem ordering issue)
- xfce4-panel-profiles (POSIX.1-2001/pax headers)
- yast2-sound (uses uname -r)
- apache-sshd (date)

The Reproducible Builds project operates a comprehensive testing framework that powers tests.reproducible-builds.org.
This month, Holger Levsen made the following changes:
- Handle the sbuild exit code.
- Move the php-horde packages back to the pkg-php-pear package set for the bullseye distribution.
- Make a number of changes relating to debrebuild.
- Deal with various issues on the build nodes relating to logrotate, pbuilder, NetBSD, unkillable processes, unresponsive nodes, proxy connection failures, too many installed kernels, etc.
- Update the systemd units.
- Improve the init_node script to suggest using sudo instead of explicit logout and logins.

In addition, the usual build node maintenance was performed by Holger Levsen, Mattia Rizzolo and Vagrant Cascadian.
You can get in touch with us via IRC (#reproducible-builds on irc.oftc.net) or by email to our mailing list, rb-general@lists.reproducible-builds.org.
Publisher: Bantam Spectra
Copyright: May 1988
Printing: July 1989
ISBN: 0-553-27903-3
Format: Mass market
Pages: 552
She shimmers, my city, she shimmers. She is said to be the most beautiful of all the cities of the Civilized Worlds, more beautiful even than Parpallaix or the cathedral cities of Vesper. To the west, pushing into the green sea like a huge, jewel-studded sleeve of city, the fragile obsidian cloisters and hospices of the Farsider's Quarter gleamed like black glass mirrors. Straight ahead as we skated, I saw the frothy churn of the Sound and their whitecaps of breakers crashing against the cliffs of North Beach and above the entire city, veined with purple and glazed with snow and ice, Waaskel and Attakel rose up like vast pyramids against the sky. Beneath the half-ring of extinct volcanoes (Urkel, I should mention, is the southernmost peak, and though less magnificent than the others, it has a conical symmetry that some find pleasing) the towers and spires of the Academy scattered the dazzling false winter light so that the whole of the Old City sparkled.

That's less than half of that paragraph, and the entire book is written like that, even in the middle of conversations. Endless, constant words piled on words about absolutely everything, whether important or not, whether emotionally significant or not. And much of it isn't even description, but philosophical ponderings that are desperately trying to seem profound. Here's another bit:
Although I knew I had never seen her before, I felt as if I had known her all my life. I was instantly in love with her, not, of course, as one loves another human being, but as a wanderer might love a new ocean or a gorgeous snowy peak he has glimpsed for the first time. I was practically struck dumb by her calmness and her beauty, so I said the first stupid thing which came to mind. "Welcome to Neverness," I told her.

Now, I should be fair: some people like this kind of description, or at least have more tolerance for it than I do. But that brings me to the second problem: there isn't a single truly likable character in this entire novel. Ringess, the person telling us this whole story, is a spoiled man-child, the sort of deeply immature and insecure person who attempts to compensate through bluster, impetuousness, and refusing to ever admit that he made a mistake or needed to learn something. He spends a good portion of the book, particularly the deeply bizarre and off-putting sections with the fake Neanderthals, attempting to act out some sort of stereotyped toxic masculinity and wallowing in negative emotions. Soli is an arrogant, abusive asshole from start to finish. Katherine, Ringess's love interest, is a seer who has had her eyes removed to see the future (I cannot express how disturbing I found Zindell's descriptions of this), has bizarre and weirdly sexualized reactions to the future she never explains, and leaves off the ends of all of her sentences, which might be the most pointlessly irritating dialogue quirk I've seen in a novel. And Ringess's mother is a man-hating feminist from a separatist culture who turns into a master manipulator (I'm starting to see why Card liked this book). I at least really wanted to like Bardo, Ringess's closest friend, who has a sort of crude loyalty and unwillingness to get pulled too deep into the philosophical quicksand lurking underneath everything in this novel.
Alas, Zindell insists on constantly describing Bardo's odious eating, belching, and sexual habits every time he's on the page, thus reducing him to a disgusting buffoon who gets drunk a lot and has irritating verbal tics. About the only person I could stand by the end of the book was Justine, who at least seems vaguely sensible (and who leaves the person who abuses her), but she's too much of a non-entity to carry sustained interest. (There is potential here for a deeply scathing and vicious retelling of this story from Justine's point of view, focusing on the ways she was belittled, abused, and ignored, but I think Zindell was entirely unaware of why that would be so effective.) Oh, and there's lots of gore and horrific injury and lovingly-described torture, because of course there is. And that brings me back to the second half of that St. Louis Post-Dispatch review quote: "... really comes to life among the intrigues of Neverness." I would love to know what was hiding behind the ellipses in this pull quote, because this half-sentence is not wrong. Insofar as Neverness has any real appeal, it's in the intrigues of the city of Neverness and in the political structure that rules it. What this quote omits is that these intrigues start around page 317, more than halfway through the novel. That's about the point where faux-Wolfe starts mixing with late-career Frank Herbert and we get poet-assassins, some revelations about the leader of the Pilot culture, and some more concrete explanations of what this mess of a book is about. Unfortunately, you have to read through the huge and essentially meaningless Neanderthal scenes to get there, scenes that have essentially nothing to do with the interesting content of this book. (Everything that motivates them turns out to be completely irrelevant to the plot and useless for the characters.) The last 40% of the book is almost passable, and characters I cared about might have even made it enjoyable.
Still, a couple of remaining problems detract heavily, chief among them the lack of connection of the great revelation of the story to, well, anything in the story. We learn at the very start of the novel that the stars of the Vild are mysteriously exploding, and much of the novel is driven by uncovering an explanation and solution. The characters do find an explanation, but not through any investigation. Ringess is simply told what is happening, in a wad of exposition, as a reward for something else entirely. It's weirdly disconnected from and irrelevant to everything else in the story. (There are some faint connections to the odd technological rules that the Pilot society lives under, but Zindell doesn't even draw attention to those.) The political intrigue in Neverness is similar: it appears out of nowhere more than halfway through the book, with no dramatic foundation for the motives of the person who has been keeping most of the secrets. And the final climax of the political machinations involves a bunch of mystical nonsense masquerading as science, and more of the Neanderthal bullshit that ruins the first half of the book. This is a thoroughly bad book: poorly plotted, poorly written, clotted and pretentious in style, and full of sociopaths and emotionally stunted children. I read the whole thing because I'm immensely stubborn and make poor life choices, but I was saying the eight deadly words ("I don't care what happens to these people") by a hundred pages in. Don't emulate my bad decisions. (Somehow, this novel was shortlisted for the Arthur C. Clarke award in 1990. What on earth could they possibly have been thinking?) Neverness is a stand-alone novel, but the ending sets up a subsequent trilogy that I have no intention of reading. Followed by The Broken God. Rating: 2 out of 10
John Fastabend has been working on a new version of af_packet, the
socket interface used by programs like tcpdump
to extract packets from network interfaces. The goal of this
change is to allow zero-copy transfers between user-space applications
and the NIC (network interface card) transmit and receive ring buffers.
Such optimizations are useful for telecommunications companies, which
may use it for deep packet
inspection or
running exotic protocols in user space. Another use case is running a
high-performance intrusion detection
system that
needs to watch large traffic streams in realtime to catch certain types
of attacks.
Fastabend presented his work during the Netdev network-performance
workshop, but also brought the patch set up for discussion during
Netconf. There, he said he could achieve line-rate extraction (and
injection) of packets, with packet rates as high as 30Mpps. This
performance gain is possible because user-space pages are directly
DMA-mapped to the NIC, which is also a security concern. The other
downside of this approach is that a complete pair of ring buffers needs
to be dedicated for this purpose; whereas before packets were copied to
user space, now they are memory-mapped, so the user-space side needs to
process those packets quickly; otherwise, they are simply dropped.
Furthermore, it's an "all or nothing" approach; while NIC-level
classifiers could be used to steer part of the traffic to a specific
queue, once traffic hits that queue, it is only accessible through the
af_packet interface and not the rest of the regular stack. If done
correctly, however, this could actually improve the way user-space
stacks access those packets, providing projects like DPDK a safer way to
share pages with the NIC, because it is well defined and
kernel-controlled. According to Jesper Dangaard Brouer (during review of
this article):
This proposal will be a safer way to share raw packet data between user space and kernel space than what DPDK is doing, [by providing] a cleaner separation as we keep driver code in the kernel where it belongs.

During the Netdev network-performance workshop, Fastabend asked if there was a better data structure to use for such a purpose. The goal here is to provide a consistent interface to user space regardless of the driver or hardware used to extract packets from the wire. af_packet currently defines its own packet format that abstracts away the NIC-specific details, but there are other possible formats. For example, someone in the audience proposed the virtio packet format. Alexei Starovoitov rejected this idea because af_packet is a kernel-specific facility while virtio has its own separate specification with its own requirements. The next step for af_packet is the posting of the new "v4" patch set, although Miller warned that this wouldn't get merged until proper XDP support lands in the Intel drivers. The concern, of course, is that the kernel would have multiple incomplete bypass solutions available at once. Hopefully, Fastabend will present the (by then) merged patch set at the next Netdev conference in November.
Gilberto Bertin described how Cloudflare moved beyond plain iptables
in order to generate cBPF (classic BPF) mitigation rules that
are then deployed on edge routers. Those rules are created with a tool
called bpfgen
, part of Cloudflare's BSD-licensed
bpftools suite. For example,
it could create a cBPF bytecode blob that would match DNS queries to any
example.com
domain with something like:
bpfgen dns *.example.com
Originally, Cloudflare would deploy those rules to plain iptables
firewalls with the xt_bpf
module, but this led to performance issues.
It then deployed a proprietary user-space solution based on
Solarflare hardware, but this has the
performance limitations of user-space applications: getting packets
back onto the wire involves the cost of re-injecting packets back into
the kernel. This is why Cloudflare is experimenting with XDP, which was
partly developed in response to the company's problems, to deploy those
BPF programs.
A concern that Bertin identified was the lack of visibility into dropped
packets. Cloudflare currently samples some of the dropped traffic to
analyze attacks; this is not currently possible with XDP unless you pass
the packets down the stack, which is expensive. Miller agreed that the
lack of monitoring for XDP programs is a large issue that needs to be
resolved, and suggested creating a way to mark packets for extraction to
allow analysis. Cloudflare is currently in a testing phase with XDP and
it is unclear if its whole XDP tool chain will be publicly available.
While those two companies are starting to use XDP as-is, there is more
work needed to complete the XDP project. As mentioned above and in our
previous coverage, massive statistics
extraction is still limited in the Linux kernel and introspection is
difficult. Furthermore, while the existing
actions (XDP_DROP
and XDP_TX
, see the
documentation
for more information) are well implemented and used, another action may
be introduced, called XDP_REDIRECT
, which would allow redirecting
packets to different network interfaces. Such an action could also be
used to accelerate bridges as packets could be "switched" based on the
MAC address table. XDP also requires network driver support, which is
currently limited. For example, the Intel drivers still do not support
XDP, although that should come pretty soon.
Miller, in his Netdev keynote, focused on XDP and presented it as the
standard solution that is safe, fast, and usable. He identified the next
steps of XDP development to be the addition of debugging mechanisms,
better sampling tools for statistics and analysis, and user-space
consistency. Miller foresees a future for XDP similar to the
popularization of the Arduino chips: a simple set of tools that anyone,
not just developers, can use. He gave the example of an Arduino tutorial
that he followed where he could just look up a part number and get
easy-to-use instructions on how to program it. Similar components should
be available for XDP. For this purpose, the conference saw the creation
of a new mailing list called
xdp-newbies where
people can learn how to create XDP build environments and how to write
XDP programs.
Thomas Graf used the example of a Death Star-themed service with API
endpoints like /dock or /comms that would allow you to dock a
ship or communicate with the Death Star. Those API endpoints should
obviously be public, but then there is this /exhaust-port
endpoint
that should never be publicly available. In order for a firewall to
protect such a system, it must be able to inspect traffic at a higher
level than the traditional address-port pairs. Graf presented a design
where the kernel would create an in-kernel socket that would negotiate
TCP connections on behalf of user space and then be able to apply
arbitrary eBPF rules in the kernel.
In this scenario, instead of doing the traditional transfer from
Netfilter's TPROXY to user space, the kernel directly decapsulates the
HTTP traffic and passes it to BPF rules that can make decisions without
doing expensive context switches or memory copies in the case of simply
wanting to refuse traffic (e.g. issue an HTTP 403 error). This, of
course, requires the inclusion of kTLS to process
HTTPS connections. HTTP2 support may also prove problematic, as it
multiplexes connections and is harder to decapsulate. This design was
described as a "pure pre-accept()
hook". Starovoitov also compared the
design to the kernel connection multiplexer (KCM).
Tom Herbert, KCM's author, agreed that it could be extended to support
this, but would require some extensions in user space to provide an
interface between regular socket-based applications and the KCM layer.
In any case, if the application does TLS (and lots of them do), kTLS
gets tricky because it breaks the end-to-end nature of TLS, in effect
becoming a man in the middle between the client and the application.
Eric Dumazet argued that HA-Proxy already
does things like this: it uses splice()
to avoid copying too much data
around, but it still does a context switch to hand over processing to
user space, something that could be fixed in the general case.
Another similar project that was
presented at
Netdev is the Tempesta
firewall and reverse-proxy. The speaker, Alex Krizhanovsky, explained
that the Tempesta developers took one person-month to port the mbed
TLS stack to the Linux kernel to allow an
in-kernel TLS handshake. Tempesta also implements rate limiting,
cookies, and JavaScript challenges to mitigate DDoS attacks. The
argument behind the project is that "it's easier to move TLS to the
kernel than it is to move the TCP/IP stack to user space". Graf
explained that he is familiar with Krizhanovsky's work and he is hoping
to collaborate. In effect, the design Graf is working on would serve as
a foundation for Krizhanovsky's in-kernel HTTP server (kHTTP). In a
private email, Graf explained that:
The main differences in the implementation are currently that we foresee to use BPF for protocol parsing to avoid having to implement every single application protocol natively in the kernel. Tempesta likely sees this less of an issue as they are probably only targeting HTTP/1.1 and HTTP/2 and to some [extent] JavaScript.

Neither project is really ready for production yet. There didn't seem to be any significant pushback from key network developers against the idea, which surprised some people, so it is likely we will see more and more layer-7 intelligence move into the kernel sooner rather than later.
The author would like to thank the Netdev and Netconf organizers for travel assistance, Thomas Graf for a review of the in-kernel proxying section of this article, and Jesper Dangaard Brouer for review of the af_packet and XDP sections. Note: this article first appeared in the Linux Weekly News.
The existing mechanisms mentioned included:

- the SO_BINDTODEVICE generic socket option (see socket(7))
- the IP_PKTINFO IP-specific socket option (see ip(7)), introduced in Linux 2.2
- the IP_UNICAST_IF flag, introduced in Linux 3.3 for WINE

The suggestion was to make the bind() system call more programmable so
that, for example, you could bind() a UDP socket to a discrete list of
IP addresses or a subnet. The issue was identified as a broader
problem that should be addressed by making the interfaces more generic.
Ahern explained that it is currently possible to bind sockets to the
slave device of a VRF even though that should not be allowed. He also
raised the question of how the kernel should tell which socket should be
selected for incoming packets. Right now, there is a scoring mechanism
for UDP sockets, but that cannot be used directly in this more general
case.
David Miller said that there are already different ways of specifying
scope: there is the VRF layer and the namespace ("netns") layer. A long
time ago, Miller reluctantly accepted the addition of netns keys
everywhere, swallowing the performance cost to gain flexibility. He
argued that a new key should not be added and instead existing
infrastructure should be reused. Herbert argued this was exactly the
reason why this should be simplified: "if we don't answer the question,
people will keep on trying this". For example, one can use a VRF to
limit listening addresses, but it gets complicated if we need a device
for every address. It seems the consensus evolved toward using
IP_UNICAST_IF, added back in 2012, which is accessible to non-root
users. It is currently limited to UDP and RAW sockets, but it could be
extended for TCP.
0x806
for ARP). Even with eBPF, it should be possible to improve the
output. Alexei Starovoitov, the BPF maintainer, explained that it might
make sense to start by returning information about the maps used by an
eBPF program. Then more complex data structures could be inspected once
we know their type.
The first priority is to get simple debugging tools working but, in the
long term, the goal is a full decompiler that can reconstruct
instructions into a human-readable program. The question that remains is
how to return this data. Ahern explained that right now the bpf()
system call copies the data to a different file descriptor, but it could
just fill in a buffer. Starovoitov argued for a file descriptor; that
would allow the kernel to stream everything through the same descriptor
instead of having many attach points. Netlink cannot be used for this
because of its asynchronous nature.
A similar issue regarding the way we identify express data
path (XDP) programs (which are
also written in BPF) was raised by Daniel Borkmann from Covalent. Miller
explained that users will want ways to figure out which XDP program was
installed, so XDP needs an introspection mechanism. We currently have
SHA-1 identifiers that can be internally used to tell which binary is
currently loaded but those are not exposed to user space. Starovoitov
mentioned it is now just a boolean that shows if a program is loaded or
not.
A use case for this, on top of just trying to figure out which BPF
program is loaded, is to actually fetch the source code of a BPF program
that was deployed in the field for which the source was lost. It is
still uncertain that it will be possible to extract an exact copy that
could then be recompiled into the same program. Starovoitov added that
he needed this in production to do proper reporting.
devlink
could
do this, as it has access to the full device tree and, therefore, to
devices that can be addressed by (say) PCI identifiers. Then a possible
devlink
command could look something like:
devlink dev pci/0000:03:00.0 option set foo bar
This idea raised a bunch of extra questions: some devices don't have a
one-to-one mapping with the PCI bridge identifiers, for example, meaning
that those identifiers cannot be used to access such devices. Another
issue is that you may want to send multiple settings in a single
transaction, which doesn't fit well in the devlink
model. Miller then
proposed to let the driver initialize itself to some state and wait for
configuration to be sent when necessary. Another way would be to
unregister the driver and re-register with the given configuration.
Shrijeet Mukherjee explained that right now, Cumulus is doing this using
horrible startup script magic by retrying and re-registering, but it
would be nice to have a more standard way to do this.
The author would like to thank the Netconf and Netdev organizers for travel to, and hosting assistance in, Toronto. Many thanks to Alexei Starovoitov for his time taken for a technical review of this article. Note: this article first appeared in the Linux Weekly News.
SOURCE_DATE_EPOCH
support in autoloads files, but upstream already disabled timestamps by default some time before.