It’s Friday, which means that tonight, many of us will sit down to watch a movie with our family, our friends, our significant other, or — for some cinephiles, best of all — by ourselves. If you haven’t yet lined up any home-cinematic experience in particular, consider taking a look at this playlist of 31 feature films just made available to stream by Warner Bros. You’ll know the name of that august Hollywood studio, of course, but did you know that it put out True Stories, the musical plunge into tabloid America directed by Talking Heads’ David Byrne? Or Waiting for Guffman, the first improvised movie by Christopher Guest and his troupe of crack comedic players like Eugene Levy, Fred Willard, Catherine O’Hara, and Parker Posey?
That may already strike many Open Culture readers as the makings of a fine double feature, though some may prefer to watch the early work of another kind of auteur: Michel Gondry’s The Science of Sleep, say, or Richard Linklater’s SubUrbia (a stage-play adaptation that could well be paired with Sidney Lumet’s Deathtrap).
But if you’re just looking to have some fun, there’s no reason you couldn’t fire up the likes of Mr. Nice Guy, Jackie Chan’s first English-language picture. Should that prove too refined, Warner Bros. has also generously made available American Ninja V — a non-canonical entry in that series, we should note, starring not original American Ninja Michael Dudikoff, but direct-to-video martial-arts icon David Bradley. On Friday night, after all, any viewing goes.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: A Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Music is often described as the most abstract of all the arts, and arguably the least visual as well. But these qualities, which seem so basic to the nature of the form, have been challenged for at least three centuries, not least by composers themselves. Take Antonio Vivaldi, whose Le quattro stagioni, or The Four Seasons, of 1718–1720 evoke not just broad impressions of the eponymous parts of the year, but a variety of natural and human elements characteristic of them. In the course of less than an hour, their listeners — whether of the early eighteenth century or the early twenty-first — “see” spring, summer, autumn, and winter unfold vividly before their mind’s eye.
Now, composer Stephen Malinowski has visualized The Four Seasons in an entirely different way. As previously featured here on Open Culture, he uses his Music Animation Machine to create what we might call graphical scores, which abstractly represent the instrumental parts that make up widely loved classical compositions in time with the music itself.
On this page, you can watch four videos, with each one visualizing one of the piece’s concerti. Fans of the Music Animation Machine will notice that its formerly simple visuals have taken a big step forward, though what can look at first like a psychedelic light show also has a clear and legible order.
For “Spring” and “Autumn,” Malinowski animates performances by violinist Shunske Sato and musicians of the Netherlands Bach Society; for “Summer” and “Winter,” performances by Cynthia Miller Freivogel and early-music ensemble Voices of Music (previously featured here for their renditions of Bach’s Brandenburg Concertos and “Air on the G String,” Pachelbel’s Canon, and indeed The Four Seasons). Generally understandable at a glance — and in many ways, more illuminating than actually seeing the musicians play their instruments — these scores also use a system called “harmonic coloring,” which Malinowski explains here. This may add up to a complete audiovisual experience, but if you’d also like a literary element, why not pull up The Four Seasons’ accompanying sonnets while you’re at it?
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: A Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
As you’ve probably noticed if you’re a regular reader of this site, we’re big fans of book illustration, particularly that from the form’s golden age—the late 18th and 19th centuries—before photography took over as the dominant visual medium. But while photographs largely supplanted illustrations in textbooks, magazines, and newspapers over the course of the 20th century, works of fiction, which had been routinely published in lavishly illustrated editions, became the featureless banks of words we know today. Though image-heavy graphic novels and comic books have thrived in recent decades, the illustrated literary text is a rarity indeed.
Why did this change come about? “I really don’t know,” writes Christopher Howse at The Telegraph, but he points out that the era of illustrated fiction for grown-ups ended “after the death of the big Victorian novelists,” like Dickens and Trollope. Before adult picture-books went out of style, several now-famous artists made careers as book illustrators. When we think of the big names from the period, we think of Aubrey Beardsley and Gustave Doré, both of whom we’ve covered heavily here. We tend not to think of Irish artist Harry Clarke—a relative latecomer—but we should. Of the many incredible illustrations from famous works of literature we’ve featured here, my favorite might be Clarke’s 1926 illustrations of Goethe’s Faust.
So out-there are some of his illustrations, so delightfully nightmarish and weird, one is tempted to fall back on that rather sophomoric explanation for art we find disturbing: maybe he was on drugs! Not that he’d need them to conjure up many of the images he did. His source material is bizarre enough (maybe Goethe was on drugs!). In any case, we can definitely call Clarke’s work hallucinatory, and that goes for his earlier, 1923 illustrations of Edgar Allan Poe’s Tales of Mystery and Imagination as well, of which you can see a few choice examples here.
Dublin-born Clarke worked as a stained-glass artist as well as an illustrator, and drew his inspiration from the earlier art nouveau aesthetic of Beardsley and others, adding his own rococo flourishes to the elongated forms and decorative patterns favored by those artists. His glowering figures—including one who looks quite a bit like Poe himself, at the top—suit the feverish intensity of Poe’s world to perfection. And like Poe’s fiction, Clarke’s art generally thrived in a seductively dark underworld filled with ghouls and fiends. Both of these proto-goths died young, Poe under mysterious circumstances at age 40, Clarke of tuberculosis at 42.
We made sand think: this phrase is used from time to time to evoke the particular technological wonders of our age, especially since artificial intelligence seems to be back on the slate of possibilities. While there would be no Silicon Valley without silica sand, semiconductors are hardly the first marvel humanity has forged out of that kind of material. Consider the three millennia of history behind the traditional Japanese sword, long known even outside the Japanese language as the katana (literally “one-sided blade”) — or, more to the point of the Veritasium video above, the 1,200 years in which such weapons have been made out of steel.
In explaining the science of the katana, Veritasium host Derek Muller begins more than two and a half billion years ago, when Earth’s oceans were “rich with dissolved iron.” But then, cyanobacteria began photosynthesizing, releasing oxygen as a by-product. That oxygen bound with the dissolved iron, dropping layers of iron oxide onto the sea floor, which eventually hardened into layers of sedimentary rock.
With few such formations of its own, geologically volcanic Japan actually came late to steel, importing it long before it could manage domestic production using the iron oxide that accumulated in its rivers, recovered as “iron sand.”
By that time, iron swords would no longer cut it, as it were, but the addition of charcoal in the heating process could produce the “incredibly strong alloy” of steel. Certain Japanese swordsmiths have continued to use steel made with the more or less traditional smelting process you can see performed in rural Shimane prefecture in the video. To the mild dismay of its producer, Petr Lebedev, who participates in the whole process, the foot-operated bellows of yore have been electrified, but he hardly seems disappointed by his chance to take up a katana himself. He may have yet to attain the skill of a master swordsman, but understanding every scientific detail of the weapon he wields must make slicing bamboo clean in half that much more satisfying.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: A Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Charlie Chaplin started appearing in his first films in 1914 (40 films, to be precise) and, by 1915, the United States had a major case of “Chaplinitis.” Chaplin mustaches were suddenly popping up everywhere, as were Chaplin imitators and Chaplin look-alike contests. A young Bob Hope apparently won one such contest in Cleveland. Chaplin fever continued burning hot through 1921, the year when the Chaplin look-alike contest shown above was held outside the Liberty Theatre in Bellingham, Washington.
According to legend, somewhere between 1915 and 1921, Chaplin decided to enter a Chaplin look-alike contest, and lost, badly.
A short article called “How Charlie Chaplin Failed,” appearing in The Straits Times of Singapore in August of 1920, read like this:
Lord Desborough, presiding at a dinner of the Anglo-Saxon club, told a story which will have an enduring life. It comes from Miss Mary Pickford, who told it to Lady Desborough: “Charlie Chaplin was one day at a fair in the United States, where a principal attraction was a competition as to who could best imitate the Charlie Chaplin walk. The real Charlie Chaplin thought there might be a chance for him so he entered for the performance, minus his celebrated moustache and his boots. He was a frightful failure and came in twentieth.”
A variation on the same story appeared in a New Zealand newspaper, the Poverty Bay Herald, again in 1920, and another in the Australian newspaper the Albany Advertiser in March 1921:
A competition in Charlie Chaplin impersonations was held in California recently. There was something like 40 competitors, and Charlie Chaplin, as a joke, entered the contest under an assumed name. He impersonated his well known film self. But he did not win; he was 27th in the competition.
Did Chaplin come in 20th place? 27th place? Did he enter a contest at all? It’s fun to imagine that he did. But, a century later, many consider the story the stuff of urban legend. When one researcher asked the Association Chaplin to weigh in, they apparently had this to say: “This anecdote told by Lord Desborough, whoever he may have been, was quite widely reported in the British press at the time. There are no other references to such a competition in any other press clipping albums that I have seen so I can only assume that this is the source of that rumour, urban myth, whatever it is. However, it may be true.”
I’d like to believe it is.
Note: An earlier version of this post appeared on our site in early 2016.
We can all remember seeing images of medieval Europeans wearing pointy shoes, but most of us have paid scant attention to the shoes themselves. That may be for the best, since the more we dwell on one fact of life in the Middle Ages or another, the more we imagine how uncomfortable or even painful it must have been by our standards. Dentistry would be the most vivid example, but even that fashionable, vaguely elfin footwear inflicted suffering, especially at the height of its popularity — not least among flashy young men — in the fourteenth and fifteenth centuries.
Called poulaines, a name drawn from the French word for Poland in reference to the footwear’s supposedly Polish origin, these pointy shoes appeared around the time of Richard II’s marriage to Anne of Bohemia in 1382. “Both men and women wore them, although the aristocratic men’s shoes tended to have the longest toes, sometimes as long as five inches,” writes Ars Technica’s Jennifer Ouellette. “The toes were typically stuffed with moss, wool, or horsehair to help them hold their shape.” If you’ve ever watched the first Blackadder series, know that the shoes worn by Rowan Atkinson’s hapless plotting prince may be comic, but they’re not an exaggeration.
Regardless, he was a bit behind the times, given that the show was set in 1485, right when poulaines went out of fashion. But they’d already done their damage, as evidenced by a 2021 study linking their wearing to nasty foot disorders. “Bunions — or hallux valgus — are bulges that appear on the side of the foot as the big toe leans in towards the other toes and the first metatarsal bone points outwards,” writes the Guardian’s Nicola Davis. A team of University of Cambridge researchers found signs of bunions to be more prevalent in the remains of individuals buried in the fourteenth and fifteenth centuries than in those buried between the eleventh and thirteenth.
Yet bunions were hardly the evil against which the poulaine’s contemporary critics inveighed. After the Great Pestilence of 1348, says the London Museum, “clerics claimed the plague was sent by God to punish Londoners for their sins, especially sexual sins.” The shoes’ lascivious associations continued to draw ire: “In 1362, Pope Urban V passed an edict banning them, but it didn’t really stop anybody from wearing them.” Then came sumptuary laws, according to which “commoners were charged to wear shorter poulaines than barons and knights.” The power of the state may be as nothing against that of the fashion cycle, but had there been a law against the bluntly square-toed shoes in vogue when I was in high school, I can’t say I would’ve objected.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: A Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
There have been many theories of how human history works. Some, like German thinker G.W.F. Hegel, have thought of progress as inevitable. Others have embraced a more static view, full of “Great Men” and an immutable natural order. Then we have the counter-Enlightenment thinker Giambattista Vico. The 18th-century Neapolitan philosopher took human irrationalism seriously, and wrote about our tendency to rely on myth and metaphor rather than reason or nature. Vico’s most “revolutionary move,” wrote Isaiah Berlin, “is to have denied the doctrine of a timeless natural law” that could be “known in principle to any man, at any time, anywhere.”
Vico’s theory of history included inevitable periods of decline (and heavily influenced the historical thinking of James Joyce and Friedrich Nietzsche). He describes his concept “most colorfully,” writes Alexander Bertland at the Internet Encyclopedia of Philosophy, “when he gives this axiom”:
Men first feel necessity, then look for utility, next attend to comfort, still later amuse themselves with pleasure, thence grow dissolute in luxury, and finally go mad and waste their substance.
The description may remind us of Shakespeare’s “Seven Ages of Man.” But for Vico, Bertland notes, every decline heralds a new beginning. History is “presented clearly as a circular motion in which nations rise and fall… over and over again.”
Two and a half centuries after Vico’s death in 1744, Carl Sagan—another thinker who took human irrationalism seriously—published his book The Demon-Haunted World, showing how much our everyday thinking derives from metaphor, mythology, and superstition. He also foresaw a future in which his nation, the U.S., would fall into a period of terrible decline:
I have a foreboding of an America in my children’s or grandchildren’s time — when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness…
Sagan believed in progress and, unlike Vico, thought that “timeless natural law” is discoverable with the tools of science. And yet, he feared “the candle in the dark” of science would be snuffed out by “the dumbing down of America…”
…most evident in the slow decay of substantive content in the enormously influential media, the 30 second sound bites (now down to 10 seconds or less), lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance…
Sagan died in 1996, a year after he wrote these words. No doubt he would have seen the fine art of distracting and misinforming people through social media as a late, perhaps terminal, sign of the demise of scientific thinking. His passionate advocacy for science education stemmed from his conviction that we must and can reverse the downward trend.
As he says in the poetic excerpt from Cosmos above, “I believe our future depends powerfully on how well we understand this cosmos in which we float like a mote of dust in the morning sky.”
When Sagan refers to “our” understanding of science, he does not mean, as he says above, a “very few” technocrats, academics, and research scientists. Sagan invested so much effort in popular books and television because he believed that all of us needed to use the tools of science: “a way of thinking,” not just “a body of knowledge.” Without scientific thinking, we cannot grasp the most important issues we all jointly face.
We’ve arranged a civilization in which most crucial elements profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.
Sagan’s 1995 predictions are now being heralded as prophetic. As Charles Bergquist, director of Public Radio International’s Science Friday, tweeted, “Carl Sagan had either a time machine or a crystal ball.” Matt Novak cautions against falling back into superstitious thinking in our praise of The Demon-Haunted World. After all, he says, “the ‘accuracy’ of predictions is often a Rorschach test” and “some of Sagan’s concerns” in other parts of the book “sound rather quaint.”
Of course Sagan couldn’t predict the future, but he did have a very informed, rigorous understanding of the issues of thirty years ago, and his prediction extrapolates from trends that have only continued to deepen. If the tools of science education—like most of the country’s wealth—end up the sole property of an elite, the rest of us will fall back into a state of gross ignorance, “superstition and darkness.” Whether we might come back around again to progress, as Giambattista Vico thought, is a matter of sheer conjecture. But perhaps there’s still time to reverse the trend before the worst arrives. As Novak writes, “here’s hoping Sagan, one of the smartest people of the 20th century, was wrong.”
Note: An earlier version of this post appeared on our site in 2017.
One would count neither Elon Musk nor Neil deGrasse Tyson among the most reserved public figures of the twenty-first century. Given the efforts Musk has been making to push into the business of outer space, which has long been Tyson’s intellectual domain, it’s only natural that the two would come into conflict. Not long ago, the media eagerly latched on to signs of a “feud” that seemed to erupt between them over Tyson’s remark that Musk — or rather, his company SpaceX — “hasn’t done anything that NASA hasn’t already done. The actual space frontier is still held by NASA.”
What this means is that SpaceX has yet to take humanity anywhere in outer space we haven’t been before. That’s not a condemnation, but in fact a description of business as usual. “The history of really expensive things ever happening in civilization has, in essentially every case, been led, geopolitically, by nations,” Tyson says in the StarTalk video above. “Nations lead expensive projects, and when the costs of these projects are understood, the risks are quantified, and the time frames are established, then private enterprise comes in later, to see if they can make a buck off of it.”
To go, boldly or otherwise, “where no one has gone before often involves risk that a company that has investors will not take, unless there’s a very clear return on investment. Governments don’t need a financial return on investment if they can get a geopolitical return on investment.” Though private enterprise may be doing more or less what NASA has been doing for 60 years, Tyson hastens to add, private enterprise does do it cheaper. In that sense, “SpaceX has been advancing the engineering frontier of space exploration,” not least by its development of reusable rockets. Still, that’s not exactly the Final Frontier.
Musk has made no secret of his aspirations to get to Mars, but Tyson doesn’t see that eventuality as being led by SpaceX per se. “The United States decides, ‘We need to send astronauts to Mars,’ ” he imagines. “Then NASA looks around and says, ‘We don’t have a rocket to do that.’ And then Elon says ‘I have a rocket!’ and rolls out his rocket to Mars. Then we ride in the SpaceX rocket to Mars.” That scenario will look even more possible if the unmanned Mars missions SpaceX has announced go according to plan. Whatever their differences, Tyson and Musk — and every true space enthusiast — surely agree that it doesn’t matter where the money comes from, just as long as we get out there one day soon.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: A Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Not everyone on August 1, 1981 had a VCR at their disposal, and not everybody stayed up until midnight. But fortunately at least one person did, in order to tape the first two hours of a new cable channel called MTV: Music Television. Did they know it would be historic? MTV certainly hoped it would be: they equated the premiere of this 24/7 video version of radio with the moon landing. People born long after this time might wonder why an MTV Video Music Award statuette honors Buzz Aldrin. But at the time, it made sense. “Ladies and Gentlemen, Rock and Roll.” It was a statement: less than three decades after the first rock and roll single, this genre of music had won; it had colonized the planet. And beyond the planet, the next stop: the universe.
It’s fitting that the execs chose as their first selection The Buggles’ “Video Killed the Radio Star.” Visuals were not just going to be an adjunct to the music; they were going to become inextricably linked to it. Either MTV was prescient about the visual decade to come, or it in fact caused that decade to happen. Music videos, or short films set to songs, had been around since the arrival of sound in the cinema, but MTV was all videos, all the time, brought to Americans thanks to the deregulation of the television industry in 1972 and the slow growth of cable channels.
After a Pat Benatar video, the VJs introduce themselves: Mark Goodman, Nina Blackwood, J.J. Jackson, Alan Hunter, and Martha Quinn (all soon to be household names and crushes). Then it’s straight into a block of commercials: school binders, Superman II, and Dolby Noise Reduction. A strange group of advertisers, to be sure. Goodman returns to ask, blindly, “Aren’t those guys the best?”, having no idea what has preceded him.
Yes, the first day of MTV was pretty rough. In fact, it’s a bit like a DJ who turns up to a gig to find they’ve left most of their records across town. In the first two hours we get two Rod Stewart songs, two by the Pretenders, two by Split Enz, another Pat Benatar video, two from Styx, and two from the Concerts for the People of Kampuchea film. We also get completely obscure videos: Ph.D.’s “Little Suzi’s on the Up,” Robin Lane and the Chartbusters’ “When Things Go Wrong,” and Michael Johnson’s “Bluer Than Blue.” This is D‑list stuff. No wonder MTV premiered at midnight.
From these humble beginnings, the channel would soon find its groove, and within two years it would become ubiquitous in American households.
People predicted the end of MTV right from the beginning. It would be a fad, or it would run out of videos to play. Forty years later, the channel has rebranded itself into oblivion. And while music videos still get made, none have the effect on generations of viewers that those of the first two decades had. To paraphrase the Buggles, we have seen the playback and it seems so long ago.
Note: An earlier version of this post appeared on our site in 2021.
It’s practically guaranteed that we now have more stupid people on the planet than ever before. Of course, we might be tempted to think: just look at how many of them disagree with my politics. But this unprecedented stupidity is primarily, if not entirely, a function of an unprecedentedly large global population. The more important matter has less to do with the quantity of stupidity than with its quality: of all the forms it can take, which does the most damage? Robert Greene, author of The 48 Laws of Power and The Laws of Human Nature, addresses that question in the clip above from an interview with podcaster Chris Williamson.
“What makes people stupid,” Greene explains, “is their certainty that they have all the answers.” The basic idea may sound familiar, since we’ve previously featured here on Open Culture the related phenomenon of the Dunning-Kruger effect. In some sense, stupid people who know they’re stupid aren’t actually stupid, or at least not harmfully so.
True to form, Greene makes a classical reference: Athens’ leaders went into the Peloponnesian War certain of victory, a war that actually brought about the end of the Athenian golden age. “People who are certain of things are very stupid,” he says, “and when they have power, they’re very, very dangerous,” perhaps more so than those we would call evil.
This brings to mind the oft-quoted principle known as Hanlon’s Razor: “Never attribute to malice that which is adequately explained by stupidity.” But even in otherwise intelligent individuals, a tendency toward premature certainty can induce that stupidity. Better, in Greene’s view, to cultivate what John Keats, inspired by Shakespeare, called “negative capability”: the power to “hold two thoughts in your head at the same time, two thoughts that apparently contradict each other.” We might consider, for instance, entertaining the ideas of our aforementioned political enemies — not fully accepting them, mind you, but also not fully accepting our own. It may, at least, prevent the onset of stupidity, a condition that’s clearly difficult to cure.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: A Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
Note: Yesterday, Marianne Faithfull passed away at age 78. In her memory, we’re bringing back a favorite from deep in our archive. It originally appeared on our site in June 2012.
That film, Made in U.S.A., came out in 1966, two years before the immortal rooftop Jefferson Airplane show Godard would film, but well into his first major burst of daring creativity, which began with 1959’s Breathless and lasted at least until Sympathy for the Devil, his 1968 documentary on — or, anyway, including — the Rolling Stones. The New Yorker’s Richard Brody pointed specifically to the clip above, a brief scene where Marianne Faithfull sings “As Tears Go By,” a hit, in separate recordings, for both Faithfull and the Stones.
Brody notes how these two minutes of a cappella performance from the 19-year-old Faithfull depict the “styles of the day.” For a long time after that day, alas, we American filmgoers had no chance to fully experience Made in U.S.A. Godard based its script on Donald E. Westlake’s novel The Jugger but never bothered to secure adaptation rights, and the film drifted in legal limbo until 2009. But today, with that red tape cut, crisp new prints circulate freely around the United States. Keep an eye on your local revival house’s listings so you won’t miss your chance to witness Faithfull’s café performance, and other such Godardian moments, in their theatrical glory. The cinephilically intrepid Brody, of course, found a way to see it, after a fashion, nearly thirty years before its legitimate American release: “The Mudd Club (the White Street night spot and music venue) got hold of a 16-mm. print and showed it — with the projector in the room — to a crowd of heavy smokers. It was like watching a movie outdoors in London by night, or as if through the shrouding mists of time.”