Former flight attendant Kat Kamalani has a serious warning for every passenger hopping onboard a plane, urging them to avoid one specific drink or face the potential consequences.
Kat Kamalani has a warning for plane passengers (Image: Instagram / TikTok)
Most of us barely give it a second thought when the trolley rattles down the aisle and a flight attendant offers a hot drink.
A cup of tea or coffee feels like a small reward after the hassle of airport security and squeezing into a narrow seat. But former flight attendant Kat Kamalani has a warning for every passenger: try to avoid drinking coffee, tea, or any water on a plane unless it comes in a sealed bottle or can.
She shared a clip on her Instagram account in which she issues a general warning to passengers. She explains: “Don’t you ever, ever, ever consume these products from an airplane, from a flight attendant. Rule number one: never consume any liquid that is not in a can or a bottle.”
Travel experts at Ski Vertigo back this up, advising travellers to buy drinks at the airport instead. Not only does this avoid the unpleasant risks, but it can also be cheaper, especially on charter flights.
In her viral video, Kat reveals a side of in-flight drinks that many travellers don’t know about. She explains: “Those water tanks are never cleaned, and they are disgusting.” Many flight crews “rarely, rarely drink the coffee or tea” served on board because it all comes from the same water tank. She adds: “These little coffee guys [coffee machines] are rarely cleaned unless they are broken. These guys [coffee kettles] are taken out and cleaned in between flights, but the whole machine is never cleaned. And they’re in the lavatories.”
She also suggests that parents avoid asking for hot water to put in a baby’s bottle, even though following that advice can be inconvenient.
While airlines insist they follow safety standards, once water travels through the aircraft’s tanks and pipes, it’s hard to guarantee it’s clean. That’s why experts and insiders now strongly suggest avoiding hot drinks made from tank water, especially if you’re pregnant, have a weaker immune system, or are travelling with young children.
Kat’s advice for parents is simple. She says: “Never ask for hot water and put it in your baby’s bottle. Ask for a bottle of water on the side and hot water in a cup. Then make your baby a bottle with the bottled water and put it in the cup and heat it up.”
For adults, Kat’s warning is just as clear: if your drink didn’t come from a sealed bottle or can, think twice before drinking it. The best approach is to stick to drinks that never go near the aircraft’s tanks: bottled water, canned soft drinks or juice. Say no to tea, coffee and even ice, which is often made from the same tap water.
A multi-agency search is under way off the coast of the Republic of Ireland for a missing Royal Navy crew member.
The Irish Department of Transport (DoT) said in a statement that the crew member was last seen at about 22:30 (local time) on Friday when the boat was near Tory Island, County Donegal.
It said the Irish Coast Guard received a distress call at about 09:00 on Saturday.
The Ministry of Defence (MoD) said the Royal Navy is assisting in the search and rescue operation to find an individual from the Royal Fleet Auxiliary.
The Irish Coast Guard, the Irish Air Corps, the RNLI, a Royal Navy vessel and others are involved in the search, which is to continue overnight.
A spokesperson for the DoT said the Malin Head Coast Guard is co-ordinating the search off the northwest coast between Tory Island and Eagle Island in County Mayo.
“The search from the air is being conducted by the Coast Guard’s fixed-wing plane Rescue 120F, based in Shannon airport; Coast Guard helicopter Rescue 118, based in Sligo; and the Irish Air Corps plane, CASA 284,” they continued.
The Royal Navy support vessel and three RNLI all-weather lifeboats, based at Ballyglass, Arranmore Island and Lough Swilly, are also involved in the sea search, alongside other vessels.
As Jaime Moore prepares to take the helm of the Los Angeles Fire Department, he said he plans to commission an outside investigation into missteps by fire officials during the mop-up of a small brush fire that reignited days later into the destructive Palisades fire.
Mayor Karen Bass had requested a probe late last month in response to reporting by The Times that firefighters were ordered to roll up their hoses and leave the burn area, even though they had complained that the ground was still smoldering.
Moore — a 30-year department veteran whose appointment was confirmed Friday by the Los Angeles City Council — said the reports have generated “understandable mistrust” in the agency.
The Times found that at least one chief assigned to LAFD’s risk management section knew about the complaints for months, but that the department kept that information hidden despite Palisades fire victims pleading for answers about whether more could have been done to protect their community.
On Wednesday, Moore told the council’s public safety committee that bringing in an outside organization to investigate the LAFD’s handling of the Jan. 1 Lachman fire would be one of his first moves as chief.
“Transparency and accountability are vital to ensure that we learn from every incident and is essential if we are to restore confidence in our Fire Department,” Moore said. “As fire chief, I will focus on rebuilding trust, not just with the public, but within the LAFD itself.”
Federal investigators say the Lachman fire was deliberately set on New Year’s Day and burned underground in a canyon root system until it was rekindled by high winds on Jan. 7. LAFD officials have said they believed the earlier fire was fully extinguished.
Moore said one of his top priorities is raising morale in a department that has come under heavy criticism for its handling of the worst wildfire in city history, which killed 12 people and destroyed thousands of homes.
In the days after the Jan. 7 Palisades fire, The Times reported that LAFD decided not to pre-deploy any engines or firefighters to the Palisades — as they had done in the past — despite being warned that some of the most dangerous winds in recent years were headed for the region.
An LAFD after-action report released last month described fire officials’ chaotic response, which included major staffing and communication issues.
Moore — who has the backing of the United Firefighters of Los Angeles City, the union that represents firefighters — said his other priorities include better preparation for major disasters, with a focus on pre-deployment and staffing, as well as for the 2026 World Cup and the 2028 Olympics.
“I’ve got skin in the game,” he said, adding that his son is an LAFD firefighter. “We need to address the amount of calls they’re going on, and make sure that they’re going on the right calls with the right resources, and if that means us having to change our department model, so be it. I have the courage to do that.”
He also said he wants to expand the LAFD’s technological capabilities and better deploy the equipment it already has, like the thermal imaging cameras and heat-detecting drones that officials did not deploy during the Lachman fire mop-up.
“We are now requiring them to be used, and we’re not picking up any type of hose until we know that we’ve been able to identify through the use of the drone, thermal imaging cameras to ensure that those surface hot spots are all taken care of,” he said.
“I wish it didn’t take this for us to have to learn the lesson about using the tools we already have,” Councilmember Traci Park replied.
Park grilled Moore on reporting by The Times that firefighters warned a battalion chief about the Lachman fire not being fully extinguished.
“We know now that our own firefighters on the ground were offering warnings that it was still too hot, that it was still too smoldering,” Park said. “For Palisades residents and Angelenos across the city who have questions and concerns, what would you say to them at this point?”
Moore referred back to the independent investigation he plans to launch.
“I want to know why it happened, how it happened, and take the necessary steps to ensure that never happens again,” he said.
The Times reviewed text messages among firefighters and a third party that indicated crews had expressed concerns that the Lachman fire would reignite if left unprotected. The exchanges occurred in the weeks and months after the Palisades fire.
In one text message, a firefighter who was at the Lachman scene Jan. 2 wrote that the battalion chief in charge had been told it was a “bad idea” to leave because of visible signs of smoldering terrain, which crews feared could start a new fire.
A second firefighter was told that tree stumps were still hot at the location when the crew packed up and left, according to the texts. And another said in texts last month that crew members were upset when directed to leave the scene, but that they could not ignore orders.
The firefighters’ accounts line up with a video recorded by a hiker above Skull Rock Trailhead late in the morning on Jan. 2 — almost 36 hours after the Lachman fire started — that shows smoke rising from the dirt. “It’s still smoldering,” the hiker says from behind the camera.
A federal grand jury subpoena was served on the LAFD for firefighters’ communications, including text messages, about smoke or hot spots in the area of the Lachman fire, according to an LAFD memo. It is unclear if the subpoena is directly related to the arson case against Jonathan Rinderknecht, who is accused of setting the Jan. 1 fire and has pleaded not guilty.
Complaints that the city and state failed to properly prepare for and respond to the Palisades fire are the subject of numerous lawsuits and a Republican-led inquiry by a U.S. Senate committee.
In addition to the pre-deployment issue, the LAFD’s after-action report found other problems during the Jan. 7 firefight. The initial dispatch called for only seven engine companies, when the weather conditions required 27. Confusion over which radio channel to use hampered communication. At one point in the first hour, three L.A. County engines showed up requesting an assignment but received no reply. Another four LAFD engines assembled, but waited 20 minutes without an assignment. In the early afternoon that day, the staging area — where engines were checking in — was overrun by fire.
Moore said he is closely evaluating the 42 recommendations in the report to make sure they are properly implemented.
Bass announced Moore’s selection last month after conducting a nationwide search that included interviews with fire chiefs of other cities. She had ousted Kristen Crowley, who was chief during the Palisades fire, citing deployment decisions ahead of the extreme weather, and appointed interim Fire Chief Ronnie Villanueva in February.
Moore — who said he grew up in the Mar Vista and Venice area — joined the LAFD as a firefighter in 1995, working his way up the ranks in various assignments throughout the city, including supervising arson investigations and serving as a spokesperson for the agency, according to his resume. He most recently was deputy chief of Operations Valley Bureau, directing the response to emergencies across 39 fire stations.
A federal grand jury subpoena has been served on the Los Angeles Fire Department for firefighters’ text messages and other communications about smoke or hot spots in the area of the Jan. 1 Lachman brushfire, which reignited six days later into the massive Palisades fire, according to an internal department memo.
The Times reported last week that a battalion chief ordered firefighters to pack up their hoses and leave the burn area the day after the Lachman fire, even though they complained that the ground was still smoldering and rocks were hot to the touch. In the memo, the department notified its employees of the subpoena, which it said was issued by the U.S. attorney’s office in Los Angeles.
“The subpoena seeks any and all communications, including text messages, related to reports of fire, smoke, or hotspots received between” 10 p.m. on New Year’s Eve and 10 a.m. on Jan. 7, said the memo, which was dated Tuesday.
A spokesperson with the U.S. attorney’s office declined to confirm that a subpoena was issued and otherwise did not comment. The memo did not include a copy of the subpoena.
The memo said the subpoena was issued in connection with an “ongoing criminal investigation” conducted by the Bureau of Alcohol, Tobacco, Firearms and Explosives.
It is unclear from the memo whether the subpoena is directly related to the case against Rinderknecht, who has pleaded not guilty.
During the Rinderknecht investigation, ATF agents concluded that the fire smoldered and burned for days underground “within the root structure of dense vegetation,” until heavy winds caused it to spark the Palisades inferno, according to an affidavit attached to the criminal complaint against Rinderknecht.
The Palisades fire, the most destructive in the city’s history, killed 12 people and destroyed thousands of homes, businesses and other structures.
Last week, The Times cited text messages among firefighters in reporting that crews mopping up the Lachman fire had warned the battalion chief that remnants of the blaze were still smoldering.
The battalion chief listed as being on duty the day firefighters were ordered to leave the Lachman fire, Mario Garcia, has not responded to requests for comment.
In one text message, a firefighter who was at the scene on Jan. 2 wrote that the battalion chief had been told it was a “bad idea” to leave because of the visible signs of smoking terrain, which crews feared could start a new fire if left unprotected.
“And the rest is history,” the firefighter wrote in recent weeks.
A second firefighter was told that tree stumps were still hot at the location when the crew packed up and left, according to the texts. And a third firefighter said this month that crew members were upset when told to pack up and leave but that they could not ignore orders, according to the texts. The third firefighter also wrote that he and his colleagues knew immediately that the Palisades fire was a rekindle of the Jan. 1 blaze.
The Fire Department has not answered questions about the firefighter accounts in the text messages but has previously said that officials did everything they could to ensure that the Lachman fire was fully extinguished. The department has not provided dispatch records of all firefighting and mop-up activity before Jan. 7.
After The Times published the story, Mayor Karen Bass directed interim Fire Chief Ronnie Villanueva to launch an investigation into the matter, while critics of her administration have asked for an independent inquiry.
You may not know Eliot Mack’s name, but if a small robot has ever crept around your kitchen, you know his work.
Before he turned his MIT-trained mind to filmmaking, Mack helped lead a small team of engineers trying to solve a deeply relatable problem: how to avoid vacuuming. Whether it was figuring out how to get around furniture legs or unclog the brushes after a run-in with long hair, Mack designed everything onscreen first with software, troubleshooting virtually and getting 80% of the way there before a single part was ever manufactured.
When Mack pivoted to filmmaking in the early 2000s, he was struck by how chaotic Hollywood’s process felt. “You pitch the script, get the green light and you’re flying into production,” he says, sounding both amused and baffled. “There’s no CAD template, no centralized database. I was like, how do movies even get made?”
That question sent Mack down a new path, trading dust bunnies for the creative bottlenecks that slow Hollywood down.
In 2004 he founded Lightcraft Technology, a startup developing what would later be known as virtual production tools, born out of his belief that if you could design a robot in software, you should be able to design a shot the same way. The company’s early system, Previzion, sold for $180,000 and was used on sci-fi and fantasy shows like “V” and “Once Upon a Time.” But Jetset, its latest AI-assisted tool set, runs on an iPhone and offers a free tier, with pro features topping out at just $80 a month. It lets filmmakers scan a location, drop it into virtual space and block out scenes with camera moves, lighting and characters. They can preview shots, overlay elements and organize footage for editing — all from a phone. No soundstage, no big crew, no gatekeepers. Lightcraft’s pitch: “a movie studio in your pocket.”
The goal, Mack says, is to put more power in the hands of the people making the work. “One of the big problems is how siloed Hollywood is,” he says. “We talked to an Oscar-winning editor who said, ‘I’m never going to get to make my movie’ — he was pigeonholed as just an editor. Same with an animator we know who has two Oscars.”
Eliot Mack, CEO of Lightcraft, an AI-powered virtual-production startup, wants to give creators the power and freedom to bring their ideas to life.
(Christina House/Los Angeles Times)
To Mack, the revolution of Jetset recalls the scrappy, guerrilla spirit of Roger Corman’s low-budget productions, which launched the early careers of directors like Francis Ford Coppola and Martin Scorsese. For generations of creatives stuck waiting on permission or funding, he sees this moment as a reset button.
“The things you got good at — writing, directing, acting, creating, storytelling — they’re still crazy useful,” he says. “What’s changing is the amount of schlepping you have to do before you get to do the fun stuff. Your 20s are a gift. You want to be creating at the absolute speed of sound. We’re trying to get to a place where you don’t have to ask anyone. You can just make the thing.”
AI is reshaping nearly every part of the filmmaking pipeline. Storyboards can now be generated from a script draft. Lighting and camera angles can be tested before anyone touches a piece of gear. Rough cuts, placeholder VFX, even digital costume mock-ups can all be created before the first shot is filmed. What once took a full crew, a soundstage and a six-figure budget can now happen in minutes, sometimes at the hands of a single person with a laptop.
This wave of automation is arriving just as Hollywood is gripped by existential anxiety. The 2023 writers’ and actors’ strikes brought the industry to a standstill and put AI at the center of a fight over its future. Since then, production has slowed, crew sizes have shrunk and the streaming boom has given way to consolidation and cost-cutting.
According to FilmLA, on-location filming in Greater Los Angeles dropped 22.4% in early 2025 compared with the year before. For many of the crew members and craftspeople still competing for those jobs, AI doesn’t feel like an innovation. It feels like a new way to justify doing more with less, only to end up with work that’s less original or creative.
“AI scrapes everything we artists have made off the internet and creates a completely static, banal world that can never imagine anything that hasn’t happened before,” documentary filmmaker Adam Curtis warned during a directors panel at the 2023 Telluride Film Festival, held in the midst of the strikes. “That’s the real weakness of the AI dream — it’s stuck with the ghosts. And I think we’ll get fed up with that.”
How you feel about these changes often depends on where you sit and how far along you are in your career. For people just starting out, AI can offer a way to experiment, move faster and bypass the usual barriers to entry. For veterans behind the scenes, it often feels like a threat to the expertise they’ve spent decades honing.
Past technological shifts — the arrival of sound, the rise of digital cameras, the advancement of CGI — changed how movies were made, but not necessarily who made them. Each wave brought new roles: boom operators and dialogue coaches, color consultants and digital compositors. Innovation usually meant more jobs, not fewer.
But AI doesn’t just change the tools. It threatens to erase the people who once used the old ones.
Diego Mariscal has seen firsthand how AI has cut potential jobs for grips.
(Jennifer Rose Clasen)
Diego Mariscal, 43, a veteran dolly grip who has worked on “The Mandalorian” and “Spider-Man: No Way Home,” saw the writing on the wall during a recent shoot. A visual effects supervisor opened his laptop to show off a reel of high-end commercials and something was missing. “There were no blue screens — none,” Mariscal recalls. “That’s what we do. We put up blues as grips. You’d normally hire an extra 10 people and have an extra three days of pre-rigging, setting up all these blue screens. He was like, ‘We don’t need it anymore. I just use AI to clip it out.’”
Mariscal runs Crew Stories, a private Facebook group with nearly 100,000 members, where working crew members share job leads, trade tips and voice their growing fears. He tries to keep up with the steady drip of AI news. “I read about AI all day, every day,” he says. “At least 20 posts a day.”
His fear isn’t just about fewer jobs — it’s about what comes next. “I’ve been doing this since I was 19,” Mariscal says of his specialized dolly work, which involves setting up heavy equipment and guiding the camera smoothly through complex shots. “I can push a cart in a parking lot. I can push a lawnmower. What else can I do?”
Who wins, who loses and what does James Cameron think?
Before AI and digital doubles, Mike Marino learned the craft of transformation the human way: through hands-on work and a fascination that bordered on obsession.
Marino was 5 years old when he first saw “The Elephant Man” on HBO. Horrified yet transfixed, he became fixated on prosthetics and the emotional power they could carry. As a teenager in New York, he pored over issues of Fangoria, studied monsters and makeup effects and experimented with sculpting his own latex masks on his bedroom floor.
Prosthetics artist Mike Marino asks a big question about generative AI: What role do human creatives play?
(Sean Dougherty / For The Times)
Decades later, Marino, 48, has become one of Hollywood’s leading makeup artists, earning Oscar nominations for “Coming 2 America,” “The Batman” and last year’s dark comedy “A Different Man,” in which he helped transform Sebastian Stan into a disfigured actor.
His is the kind of tactile, handcrafted work that once seemed irreplaceable. But today AI tools are increasingly capable of achieving similar effects digitally: de-aging actors, altering faces, even generating entire performances. What used to take weeks of experimentation and hours in a makeup trailer can now be approximated with a few prompts and a trained model. To Marino, AI is more than a new set of tools. It’s a fundamental change in what it means to create.
“If AI is so good it can replace a human, then why have any human beings?” he says. “This is about taste. It’s about choice. I’m a human being. I’m an artist. I have my own ideas — mine. Just because you can make 10,000 spaceships in a movie, should you?”
“If AI is so good it can replace a human, then why have any human beings?”
— Mike Marino, makeup artist on “A Different Man”
Marino is no technophobe. His team regularly uses 3D scanning and printing. But he draws the line at outsourcing creative judgment to a machine. “I’m hoping there are artists who want to work with humans and not machines,” he says. “If we let AI just run amok with no taste, no choice, no morality behind it, then we’re gone.”
Not everyone sees AI’s rise in film production as a zero-sum game. Some technologists imagine a middle path. Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Lab and one of the world’s leading AI researchers, believes the future of filmmaking lies in a “human-machine partnership.”
AI, Rus argues, can take on time-consuming tasks like animating background extras, color correction or previsualizing effects, freeing up people to focus on what requires intuition and taste. “AI can help with the routine work,” she says. “But the human touch and emotional authenticity are essential.”
Few directors have spent more time grappling with the dangers and potential of artificial intelligence than James Cameron. Nearly 40 years before generative tools entered Hollywood’s workflow, he imagined a rogue AI triggering global apocalypse in 1984’s “The Terminator,” giving the world Skynet — now a cultural shorthand for the dark side of machine intelligence. Today, he continues to straddle that line, using AI behind the scenes on the upcoming “Avatar: Fire and Ash” to optimize visual effects and performance-capture, while keeping creative decisions in human hands. The latest sequel, due Dec. 19, promises to push the franchise’s spectacle and scale even further; a newly released trailer reveals volcanic eruptions, aerial battles and a new clan of Na’vi.
A scene from “Avatar: The Way of Water.” Director James Cameron differentiates between machine learning used to reduce monotonous moviemaking work and generative AI.
(Courtesy of 20th Century Studios)
“You can automate a lot of processes that right now tie up a lot of artists doing mundane tasks,” Cameron told The Times in 2023 at a Beyond Fest screening of his 1989 film “The Abyss.” “So if we could accelerate the postproduction pipeline, then we can make more movies. Then those artists will get to do more exciting things.”
For Cameron, the promise of AI lies in efficiency, not elimination. “I think in our particular industry, it’s not going to replace people; it’s going to free them to do other things,” he believes. “It’s going to accelerate the process and bring the price down, which would be good because, you know, some movies are a little more expensive than others. And a lot of that has to do with human energy.”
Cameron himself directed five films between 1984 and 1994 and only three in the three decades since, though each one has grown increasingly complex and ambitious.
That said, Cameron has never been one to chase shortcuts for their own sake. “I think you can make pre-viz and design easier, but I don’t know if it makes it better,” he says. “I mean, if easy is your thing. Easy has never been my thing.”
He draws a line between the machine-learning techniques his team has used since the first “Avatar” to help automate tedious tasks and the newer wave of generative AI models making headlines today.
“The big explosion has been around image-based generative models that use everything from every image that’s ever been created,” he says. “We’d never use any of them. The images we make are computer-created, but they’re not AI-created.”
In his view, nothing synthetic can replace the instincts of a flesh-and-blood artist. “We have human artists that do all the designs,” he says. “We don’t need AI. We’ve got meat-I. And I’m one of the meat-artists that come up with all that stuff. We don’t need a computer. Maybe other people need it. We don’t.”
Reshaping creativity — and creative labor
Rick Carter didn’t go looking for AI as a tool. He discovered it as a lifeline.
The two-time Oscar-winning production designer, who worked with Cameron on “Avatar” and whose credits include “Jurassic Park” and “Forrest Gump,” began experimenting with generative AI tools like Midjourney and Runway during the pandemic, looking for a way to keep his creative instincts sharp while the industry was on pause. A longtime painter, he was drawn to the freedom the programs offered.
“I saw that there was an opportunity to create images where I didn’t have to go to anybody else for approval, which is the way I would paint,” Carter says by phone from Paris. “None of the gatekeeping would matter. I have a whole lot of stories on my own that I’ve tried to get into the world in various ways and suddenly there was a way to visualize them.”
Midjourney and Runway can create richly detailed images — and in Runway’s case, short video clips — from a text prompt or a combination of text and visuals. Trained on billions of images and audiovisual materials scraped from the internet, these systems learn to mimic style, lighting, composition and form, often with eerie precision. In a production pipeline, these tools can help concept artists visualize characters or sets, let directors generate shot ideas or give costume designers and makeup artists a fast way to test looks, long before physical production begins.
But as these tools gain traction in Hollywood, a deeper legal and creative dilemma is coming into focus: Who owns the work they produce? And what about the copyrighted material used to train them?
In June, Disney and Universal filed a federal copyright lawsuit against Midjourney, accusing the company of generating unauthorized replicas of characters such as Spider-Man, Darth Vader and Shrek using AI models trained on copyrighted material: what the suit calls a “bottomless pit of plagiarism.” It’s the most high-profile of several legal challenges now putting copyright law to the test in the age of generative AI.
“Forrest Gump” director Robert Zemeckis, left, with production designer Rick Carter at an art installation of the movie’s famed bench. (Carter family)
Working with generative models, Carter began crafting what he calls “riffs of consciousness,” embracing AI as a kind of collaborative partner, one he could play off of intuitively. The process reminded him of the loose, improvisational early stages of filmmaking, a space he knows well from decades of working with directors like Robert Zemeckis and Steven Spielberg.
“I’ll just start with a visual or a word prompt and see how it iterates from there and what it triggers in my mind,” Carter says. “Then I incorporate that so it builds on its own in an almost free-associative way. But it’s still based upon my own intuitive, emotional, artistic, even spiritual needs at that moment.”
He describes the experience as a dialogue between two minds, one digital and one human: “One AI is artificial intelligence. The other AI is authentic intelligence — that’s us. We’ve earned it over this whole span of time on the planet.”
Sometimes, Carter says, the most evocative results come from mistakes. While sketching out a story about a hippie detective searching for a missing woman in the Himalayas, he accidentally typed “womb” into ChatGPT instead of “woman.” The AI ran with it, returning three pages of wild plot ideas involving gurus, seekers and a bizarre mystery set in motion by the disappearance.
“I couldn’t believe it,” he says. “I would never have taken it that far. The AI is so precocious. It is trying so much to please that it will literally make something out of the mistake you make.”
Carter hasn’t used generative AI on a film yet; most of his creations are shared only with friends. But he says the technology is already slipping into creative workflows in covert ways. “There are issues with copyrights with most of the studios so for now, it’s going to be mostly underground,” he says. “People will use it but they won’t acknowledge that they’re using it — they’ll have an illustrator do something over it, or take a photo so there’s no digital trail.”
Carter has lived through a major technological shift before. “I remember when we went from analog to digital, from ‘Jurassic Park’ on,” he says. “There were a lot of wonderful artists who could draw and paint in ways that were just fantastic but they couldn’t adapt. They didn’t want to — even the idea of it felt like the wrong way to make art. And, of course, most of them suffered because they didn’t make it from the Rolodex to the database in terms of people calling them up.”
He worries that some artists may approach the technology with a rigid sense of authorship. “Early on, I found that the less I used my own ego as a barometer for whether something was artistic, the more I leaned into the process of collaboratively making something bigger than the sum of its parts — and the bigger and better the movies became.”
Others, like storyboard artist Sam Tung, are bracing against the same wave with a quiet but unshakable defiance.
Tung, whose credits include “Twisters” and Christopher Nolan’s upcoming adaptation of “The Odyssey,” has spent the last two years tracking the rise of generative tools, not just their capabilities but their implications. As co-chair of the Animation Guild’s AI Committee, he has been on the front lines of conversations about how these technologies could reshape creative labor.
To artists like Tung, the rise of generative tools feels deeply personal. “If you are an illustrator or a writer or whatever, you had to give up other things to take time to develop those skills,” he says. “Nobody comes out of the womb being able to draw or write or act. Anybody who does that professionally spent years honing those skills.”
“Anything I’ve made with AI, I’ve quickly forgotten about. There’s basically nothing I get from putting it on social media, other than the ire of my peers.”
— Sam Tung, storyboard artist on “The Odyssey”
Tung has no interest in handing that over to a machine. “It’s not that I’m scared of it — I just don’t need it,” he says. “If I want to draw something or paint something, I’ll do it myself. That way it’s exactly what I want and I actually enjoy the process. When people tell me they responded to a drawing I did or a short film I made with friends, it feels great. But anything I’ve made with AI, I’ve quickly forgotten about. There’s basically nothing I get from putting it on social media, other than the ire of my peers.”
What unsettles him isn’t just the slickness of AI’s output but how that polish is being used to justify smaller crews and faster turnarounds. “If this is left unchecked, it’s very easy to imagine a worst-case scenario where team sizes and contract durations shrink,” Tung says. “A producer who barely understands how it works might say, ‘Don’t you have AI to do 70% of this? Why do you need a whole week to turn around a sequence? Just press the button that says: MAKE MOVIE.’ ”
At 73, Carter isn’t chasing jobs. His legacy is secure. “If they don’t hire me again, that’s OK,” he says. “I’m not in that game anymore.” He grew up in Hollywood — his father was Jack Lemmon’s longtime publicist and producing partner — and has spent his life watching the industry evolve. Now, he’s witnessing a reckoning unlike any he, or anyone else, has ever imagined.
“I do have concerns about who is developing AI and what their values are,” he says. “What they use all this for is not necessarily something I would approve of — politically, socially, emotionally. But I don’t think I’m in a position to approve or not.”
Earlier this year, the Palisades fire destroyed Carter’s home, taking with it years of paintings and personal artwork. AI, he says, has given him a way to keep creating through the upheaval. “It saved me through the pandemic, and now it’s saving me through the fire,” he says, as if daring the universe to test him again. “It’s like, go ahead, throw something else at me.”
‘Prompt and pray?’ Not so fast
Many in the industry may still be dipping a toe into the waters of AI. Verena Puhm dove in.
The Austrian-born filmmaker studied acting and directing in Munich and Salzburg before moving to Los Angeles, where she built a globe-spanning career producing, writing and developing content for international networks and streamers. Her credits range from CNN’s docuseries “History of the Sitcom” to the German reboot of the cult anthology “Beyond Belief: Fact or Fiction” and a naval documentary available on Tubi. More recently, she has channeled that same creative range into a deepening exploration of generative tools.
Puhm first began dabbling with AI while using Midjourney to design a pitch deck, but it wasn’t until she entered a timed generative AI filmmaking challenge at the 2024 AI on the Lot conference — informally dubbed a “gen battle” — that the creative potential of the medium hit her.
“In two hours, I made a little mock commercial,” she remembers, proudly. “It was actually pretty well received and fun. And I was like, Oh, wow, I did this in two hours. What could I do in two days or two weeks?”
What started as experimentation soon became a second act. This summer, Puhm was named head of studio for Dream Lab LA, a new creative arm of Luma AI, which develops generative video tools for filmmakers and creators. There, she’s helping shape new storytelling formats and supporting emerging creators working at the intersection of cinema and technology. She may not be a household name, but in the world of experimental storytelling, she’s fast becoming a key figure.
Verena Puhm, a director, writer and producer who has used generative AI in a number of her projects, says it’s breaking down barriers to entry.
(Jason Armond/Los Angeles Times)
Some critics dismiss AI filmmaking as little more than “prompt and pray”: typing in a few words and hoping something usable comes out. Puhm bristles at the phrase.
“Anybody that says that tells me they’ve never tried it at all, because it is not that easy and simple,” she says. “You can buy a paintbrush at Home Depot for, what, $2? That doesn’t make you a painter. When smartphones first came out, there was a lot of content being made but that didn’t mean everyone was a filmmaker.”
What excites her most is how AI is breaking down the barriers that once kept ambitious ideas out of reach. Luma’s new Modify Video tool lets filmmakers tweak footage after it’s shot, changing wardrobe, aging a character, shifting the time of day, all without reshoots or traditional VFX. It can turn a garage into a spaceship, swap a cloudy sky for the aurora borealis or morph an actor into a six-eyed alien, no green screen required.
“I remember shopping projects around and being told by producers, ‘This scene has to go, that has to go,’ just to keep the budget low. Now everything is open.”
— Verena Puhm, head of studio at Dream Lab LA
“It’s such a relief as an artist,” Puhm says. “If there’s a project I’ve been sitting on for six years because I didn’t have a $5 million budget — suddenly there’s no limit. I remember shopping projects around and being told by producers, ‘This scene has to go, that has to go,’ just to keep the budget low. Now everything is open.”
That sense of access resonates far beyond Los Angeles. At a panel during AI on the Lot, “Blue Beetle” director Ángel Manuel Soto reflected on how transformative AI might have been when he was first starting out. “I wish tools like this existed when I wanted to make movies in Puerto Rico, because nobody would lend me a camera,” he said. “Access to equipment is a privilege we sometimes take for granted. I see this helping kids like me from the projects tell stories without going bankrupt — or stealing, which I don’t condone.”
Puhm welcomes criticism of AI but only when it’s informed. “If you hate AI and you’ve actually tested the tools and educated yourself, I’ll be your biggest supporter,” she says. “But if you’re just speaking out of fear, with no understanding, then what are you even basing your opinion on?”
She understands why some filmmakers feel rattled, especially those who, like her, grew up dreaming of seeing their work on the big screen. “I still want to make features and TV series — that’s what I set out to do,” she says. “I hope movie theaters don’t go away. But if the same story I want to tell reaches millions of people on a phone and they’re excited about it, will I really care that it wasn’t in a theater?”
“I just feel like we have to adapt to the reality of things,” she continues. “That might sometimes be uncomfortable, but there is so much opportunity if you lean in. Right now any filmmaker can suddenly tell a story at a high production value that they could have never done before, and that is beautiful and empowering.”
For many, embracing AI boils down to a simple choice: adapt or get cut from the frame.
Hal Watmough, a BAFTA-winning British editor with two decades of experience, first began experimenting with AI out of a mix of curiosity and dread. “I was scared,” he admits. “This thing was coming into the industry and threatening our jobs and was going to make us obsolete.” But once he started playing with tools like Midjourney and Runway, he quickly saw how they could not only speed up the process but allow him to rethink what his career could be.
For an editor used to working only with what he was given, the ability to generate footage on the fly, cut with it immediately and experiment endlessly without waiting on a crew or a shoot was a revelation. “It was still pretty janky at that stage, but I could see the potential,” he says. “It was kind of intoxicating. I started to think, I’d like to start making things that I haven’t seen before.”
After honing his skills with various AI tools, Watmough created a wistful, vibrant five-minute animated short called “LATE,” about an aging artist passing his wisdom to a young office worker. Over two weeks, he generated 2,181 images using AI, then curated and refined them frame by frame to shape the story.
Earlier this year, he submitted “LATE” to what was billed as the world’s first AI animation contest, hosted by Curious Refuge, an online education hub for creative technologists — and, to his delight, he won. The prize included $10,000, a pitch meeting with production company Promise Studios and, as an absurd bonus, his face printed on a potato. But for Watmough, the real reward was the sense that he had found a new creative identity.
“There’s something to the fact that the winner of the first AI animation competition was an editor,” Watmough says. “With the advent of AI, yes, you could call yourself a filmmaker but essentially I’d say most people are editors. You’re curating, selecting, picking what you like — relying on your taste.”
Thanks to AI, he says he’s made more personal passion projects in the past year and a half than during his entire previous career. “I’ll be walking or running and ideas just come. Now I can go home that night and try them,” he says. “None of that would exist without AI. So either something exists within AI or it never exists at all. And all the happiness and fulfillment that comes with it for the creator doesn’t exist either.”
Watmough hasn’t entirely lost his fear of what AI might do to the creative workforce, even as he is energized by what it makes possible. “A lot of people I speak to in film and TV are worried about losing their jobs and I’m not saying the infrastructure roles won’t radically change,” he says. “But I don’t think AI is going to replace that many — if any — creative people.”
What it will do, he says, is raise the bar. “If anyone can create anything, then average work will basically become extinct or pointless. AI can churn out remakes until the cows come home. You’ll have to pioneer to exist.”
He likens the current moment to the birth of cinema more than a century ago — specifically the Lumière brothers’ “Arrival of a Train at La Ciotat,” the 1896 short that famously startled early audiences. In the silent one-minute film, a steam train rumbles toward the camera, growing larger. Some viewers reportedly leaped from their seats, convinced it was about to crash into them.
“People ran out of the theater screaming,” Watmough says. “Now we don’t even think about it. With AI, we’re at that stage again. We’re watching the steam train come into the station and people are either really excited or they’re running out of the theater in fear. That’s where we are, right at the start. And the potential is limitless.”
Then again, he adds with a dry laugh, “I’m an eternal optimist, so take what I say with a grain of salt.”