Daily Technology
· 14/02/2026
Artificial intelligence on phones is moving from waiting for orders to offering help before the user asks. Samsung and Google, the two largest forces in Android, pursue this goal in different ways. Samsung is building a feature called “Now Nudge” that relies on Galaxy AI, while Google keeps improving its set of context-aware tools. A close look at each shows two separate plans for making the phone feel simpler to use.
Early code from the upcoming One UI 9.0 firmware shows that “Now Nudge” acts as an assistant that steps in on its own. The software reads what appears on the screen at that moment and then shows a prompt for a useful next step. If a person fills out a form, the assistant supplies possible entries for each blank. When a chat mentions a possible meeting, a button appears that opens the calendar, removing the need to leave the current app.
Test builds reveal support for many practical jobs: hailing a scooter or bike rental, pulling up flight details, reaching government services, fetching location facts, showing medical data and reserving cinema seats. The design aims to finish entire jobs rather than merely surface information. The tool was first expected in One UI 8.5, but current leaks place the full release in One UI 9.0, giving future Galaxy phones a headline feature.
Google places its AI inside the interface and usually waits for the user to start the process. “Circle to Search” lets a person draw around any part of the display to launch a visual search, and the Gemini assistant opens as an overlay when the user requests it. This pattern gives the user control over the timing and method of each request.
The “At a Glance” widget on Pixel phones supplies a limited form of proactive data. It lists calendar events, weather warnings and boarding passes on the home and lock screens, but its job is to display facts rather than perform tasks. Google gains power from deep links to search and the Knowledge Graph, offering wide-ranging answers whenever the user calls for them.
The key difference lies in who starts the action. Now Nudge uses a “push” design: the assistant surfaces a card without user input. Google's main tools follow a “pull” model: the user presses, circles or opens an app before the AI reacts.
Now Nudge targets specific multi-step chores like bookings and paperwork. Google's reach is wider, covering object recognition, on-screen translation and open-ended web search. Each path marks a clear advance toward smarter phones, but they rest on opposing views of the ideal assistant: one acts as an ever-present guide, the other as a powerful tool summoned at will.