Intro: A major upgrade for image editing
Google is expanding its Nano Banana image-editing model beyond its experimental phase, bringing the tool to Google Search, Google Photos, and NotebookLM. After a summer rollout in Gemini 2.5 Flash, Nano Banana lets users modify images with natural language prompts. The move signals Google's push to integrate conversational AI deeply into everyday photo editing and search.
Where Nano Banana lands: Lens, Search, NotebookLM, and Photos
Lens and AI Mode in Google Search
In the Lens app, users will eventually see a Create button with a banana icon at the bottom. Tapping it opens an AI prompt where you can describe how you want a photo changed. The resulting edits appear in-line within the app, and you can approve them or request follow-up edits through the AI Mode interface. Google is also enabling a separate path to Nano Banana via the generic Create image tool in Search, allowing conversational editing directly from search results. This makes Nano Banana accessible without opening Gemini or a dedicated editor.
NotebookLM: video and style options powered by Nano Banana
NotebookLM enhances its AI-generated video summaries with a new set of Nano Banana-driven video styles. In addition to the existing explainer and classic formats, users can choose styles like whiteboard, anime, retro print, and more. A new "Brief" option adds another format for quick, to-the-point videos. While the added styles improve visual consistency, the output remains subject to the usual generative AI limitations: prompts guide the result rather than guarantee a specific edit.
Google Photos: coming soon, with a major upgrade
Google has teased that the Nano Banana image editor will arrive in Google Photos in the coming weeks, though a firm rollout timeline isn't public yet. The integration is described as a "major upgrade" over the previous image-editing model, building on the conversational editing that arrived in Photos last month. For casual photo work, like retouching, color adjustments, or creative tweaks, the Nano Banana feature could dramatically reduce friction and frustration.
Why this matters: convenience, control, and AI capabilities
By embedding Nano Banana across Search, Photos, and NotebookLM, Google is creating a cohesive ecosystem where a simple prompt can alter how you find, present, and summarize visual content. The approach aligns with broader AI goals: make complex tasks feel intuitive and conversational. As always with AI-powered editing, users should be mindful of originality and attribution when modifying images, especially in professional or public contexts.
Looking ahead: what users can expect
Early users can expect smoother, more natural prompting, faster edits, and broader availability as Nano Banana scales across apps. The updates also underscore Google's intent to deepen engagement with its conversational search experience, offering a unified workflow across Lens, Search, and workspace tools like NotebookLM. For everyday users, the combined capability means fewer clicks and more creative control when tailoring images to fit their needs, whether for research, presentations, or social sharing.