How MNCs are benefiting from AI/ML, and how they are using it to improve their products.

1. Apple
Apple’s AI and ML chief said in an interview:
“There’s a whole bunch of new experiences that are powered by machine learning. And these are things like language translation, or on-device dictation, or our new features around health, like sleep and hand washing, and stuff we’ve released in the past around heart health and things like this. I think there are increasingly fewer and fewer places in iOS where we’re not using machine learning.
It’s hard to find a part of the experience where you’re not doing some predictive work. Like, app predictions, or keyboard predictions, or modern smartphone cameras do a ton of machine learning behind the scenes to figure out what they call “saliency,” which is like, what’s the most important part of the picture? Or, if you imagine doing blurring of the background, you’re doing portrait mode.
Savvy iPhone owners might also notice that machine learning is behind the Photos app’s ability to automatically sort pictures into pre-made galleries, or to accurately give you photos of a friend named Jane when her name is entered into the app’s search field.
All of these things benefit from the core machine learning features that are built into the core Apple platform. So, it’s almost like, “Find me something where we’re not using machine learning.”
I understand this perception that bigger models in data centres are somehow more accurate, but it’s actually wrong. It’s actually technically wrong. It’s better to run the model close to the data, rather than moving the data around.”
Apple sees a huge future in AI for its products, which are already embedded with strong AI and ML models that deliver good user experiences. Siri is one of the major examples of how Apple enriches the customer experience day by day.

2. Google
Google detailed the ways it’s applying AI and machine learning to improve the Google Search experience.
Google says users will soon be able to see how busy places are in Google Maps without searching for specific beaches, grocery stores, pharmacies, or other locations, an expansion of Google’s existing busyness metrics. The company also says it’s adding COVID-19 safety information to businesses’ profiles across Search and Maps, revealing whether they’re using safety precautions like temperature checks, plexiglass shields, and more.
Google announced a new spelling algorithm for Search. “This single change makes a greater improvement to spelling than all of our improvements over the last five years,” Google head of search Prabhakar Raghavan said in a blog post. Separately, Google says it can now index individual passages from webpages, as opposed to whole pages. When this rolls out fully, Google claims it will improve roughly 7% of search queries across all languages. A complementary AI component will help Search capture the nuances of webpage content, ostensibly leading to a wider range of results for search queries.
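To make the passage-indexing idea concrete, here is a minimal sketch. It is purely illustrative: Google’s system uses learned neural ranking, whereas this toy version splits pages into fixed-size word windows and scores each passage by simple query-term overlap. All function names and the scoring rule are assumptions, not Google’s actual implementation.

```python
# Toy sketch of passage-level indexing: rank individual passages,
# not whole pages. Term-overlap scoring is a stand-in for the
# neural ranking Google actually uses.

def split_into_passages(doc, size=30):
    """Split a document into fixed-size word windows ("passages")."""
    words = doc.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(passage, query):
    """Count how many passage words are query terms (crude relevance)."""
    terms = set(query.lower().split())
    return sum(1 for w in passage.lower().split() if w in terms)

def best_passage(docs, query):
    """Return the single best-scoring passage across all documents."""
    passages = [p for d in docs for p in split_into_passages(d)]
    return max(passages, key=lambda p: score(p, query))
```

The key design point mirrors the announcement: a page that is mostly about something else can still surface, because one highly relevant passage inside it is indexed and ranked on its own.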
“We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad,” Raghavan continued. “As an example, if you search for ‘home exercise equipment,’ we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page.”
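The subtopic behaviour Raghavan describes can be sketched as a diversification step over results. This is a hypothetical illustration only: the hard-coded subtopic keywords and the interleaving rule are my assumptions, while Google derives subtopics with neural nets.

```python
# Hypothetical sketch of subtopic-diversified search results for a broad
# query like "home exercise equipment". Keyword lists are assumptions.

SUBTOPICS = {
    "budget": ["cheap", "budget", "affordable"],
    "premium": ["premium", "best", "pro"],
    "small space": ["small", "compact", "apartment"],
}

def tag(title):
    """Assign a result title to a subtopic by keyword match."""
    t = title.lower()
    for topic, keywords in SUBTOPICS.items():
        if any(k in t for k in keywords):
            return topic
    return "general"

def diversify(titles):
    """Interleave results so each subtopic is represented early on."""
    buckets = {}
    for title in titles:
        buckets.setdefault(tag(title), []).append(title)
    out = []
    while any(buckets.values()):
        for topic in list(buckets):
            if buckets[topic]:
                out.append(buckets[topic].pop(0))
    return out
```

The effect is the one described in the quote: instead of ten near-identical links, the first few results span budget picks, premium picks, and small-space ideas.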
Google is also bringing Data Commons — its open knowledge repository that combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) using mapped common entities — to search results on the web and mobile. In the near future, users will be able to search for topics like “employment in Chicago” on Search to see information in context.
In another addition to Search, Google says it will deploy a feature that highlights notable points in videos — for example, a screenshot comparing different products or a key step in a recipe. Google expects 10% of searches will use this technology by the end of 2020. And Live View in Maps, a tool that taps AR to provide turn-by-turn walking directions, will enable users to quickly see information about restaurants, including how busy they tend to be and their star ratings.
Google says melodies hummed in Search are transformed by machine learning algorithms into number-based sequences. The models are trained to identify songs based on a variety of sources, including humans singing, whistling, or humming, as well as studio recordings. The algorithms also abstract away all the other details, like accompanying instruments and the voice’s timbre and tone. What remains is a fingerprint Google compares with thousands of songs from around the world to identify potential matches in real time, much like the Pixel’s Now Playing feature.
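The fingerprinting idea above can be sketched with a toy model: reduce a melody to its sequence of pitch intervals (which discards key, instruments, and vocal timbre, as the article describes) and compare sequences by edit distance. This is an assumption-laden illustration; Google's matching is done by trained models over far richer representations, not hand-coded edit distance.

```python
# Toy sketch of hum-to-search: abstract a melody to relative pitch
# intervals, then find the closest catalog song by edit distance.

def contour(pitches):
    """Reduce a melody (MIDI note numbers) to relative intervals,
    discarding the key it was hummed in."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def distance(a, b):
    """Levenshtein edit distance between two interval sequences,
    tolerating missed or extra notes in the hum."""
    m, n = len(a), len(b)
    d = [[i + j if i * j == 0 else 0 for j in range(n + 1)]
         for i in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
    return d[m][n]

def best_match(hummed, catalog):
    """Return the catalog song whose contour is closest to the hum."""
    hum = contour(hummed)
    return min(catalog, key=lambda name: distance(hum, contour(catalog[name])))
```

Because only intervals are compared, a melody hummed five semitones higher than the recording still matches, which is the "abstract away the details, keep the fingerprint" behaviour the article describes.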