Apple AI Glasses Expected To Be Launched This Year

Posted by Kirhat | Monday, March 16, 2026 | | 0 comments »

Apple AI Glasses
In a bid to fulfill a long-held wish of tech enthusiasts, Apple is reportedly gearing up to launch its AI glasses in 2026. With tech giants like Meta and Google also entering the smart wearables market, Apple’s entry has triggered excitement among users and analysts alike. Speculation that the company could give Meta’s Ray-Ban glasses a run for their money is generating buzz across social media.

Apple is preparing to challenge competitors Meta and Google with AI-powered smart glasses slated for a 2026 launch. The glasses are expected to feature built-in Siri support for tasks like making phone calls, navigation, and music control.

According to sources cited by 9to5Mac, Apple’s highly anticipated smart glasses are set to make their debut in 2026.

However, the launch window could stretch into 2027. These AI glasses, armed with cutting-edge technology, allegedly boast an array of innovative features. With a built-in camera and intuitive voice control via Siri, users are in for a game-changing experience.

Moreover, health-tracking features can be utilised for monitoring biometrics and tracking fitness activities. The rumored seamless integration with iPhones promises enhanced functionality.

9to5Mac hints that the Apple Glasses’ features will scream innovation. The Apple AI Glasses come with an Apple Watch S-class chip, boosting power efficiency and enabling advanced visual features. This chip will enable multiple cameras to spring to life, unlocking features that rival the iPhone’s prowess.

The initial Apple Glasses model is rumored to skip the display feature, prioritizing core AI-powered functionalities instead. Meanwhile, the premium variant is in the works, promising a display that will take the wearable experience to new heights. As per reports, the Apple AI glasses will be available in multiple frame styles and color options, giving users more choice in design and personalization. They might also come with prescription lenses, making them accessible to users who require vision correction.


Apple Has A New Feature Installed In iPhones

Posted by Kirhat | Thursday, January 08, 2026 | | 0 comments »

iPhone New Feature
Apple recently released iOS 26.2, which introduces several new abilities to current iPhones.

Admittedly, the description for the updated firmware is vague. Apple explains that it "includes enhancements to Apple Music, Podcasts, and Games, as well as other features, bug fixes, and security updates for your iPhone."

However, there’s quite a bit more going on here. Notably, the firmware gives users more freedom to tweak and customize Liquid Glass settings, as well as adds alarms to the Reminders app.

There’s also a new Accessibility setting that makes it easier to tell when the user has received a notification.

Many probably didn’t know it, but the iPhone has long had an Accessibility setting called "Flash for Alerts" buried in the Settings app. When enabled, the rear camera’s flash turns on when the user receives a notification.

With iOS 26.2, Apple has added a new option within Flash for Alerts that lets the iPhone’s front screen flash as well, giving users more ways to know when they have received a notification.

To enable:

  1. Open the Settings app on your iPhone.
  2. Select Accessibility (beneath General).
  3. Select Audio & Visual (in the Hearing section).
  4. Scroll to the bottom and select Flash for Alerts.
  5. Toggle Flash for Alerts on, and select the new Screen option.
Alternatively, users can select Both, in which case the iPhone’s rear camera flash and front display will both light up when a notification arrives.


New Model Helps Humanoid Robots Adapt More

Posted by Kirhat | Wednesday, December 17, 2025 | | 0 comments »

Humanoid Robot
Christopher McFadden of Interesting Engineering reported that researchers from Wuhan University have recently developed a new framework that could help robots manipulate objects more easily. Introduced in a new paper on arXiv, this approach should enable humanoid robots to grasp and handle a greater variety of objects than is currently possible.

At present, humanoid robots are great at tasks like using tools, grasping, and walking, but they suffer from inherent limitations. In most cases, they can fail tasks when an object changes shape or when lighting changes.

They can also struggle to complete tasks they haven’t been specifically trained to do. It is this lack of generalization that is widely seen as one of the technology’s major limitations.

To help overcome this, the Wuhan team set out to develop what it calls the recurrent geometric-prior multimodal policy, RGMP for short. This framework is designed to help humanoid robots have a kind of in-built common sense about things like shapes and space.

It also provides robots with a means to better select required skills for a task, and a more data-efficient way to learn movement patterns.

Ultimately, the goal is to help robots pick the right action and adapt to new environments with far less training data than before. According to the team, RGMP consists of two main parts.

The first is called the Geometric-Prior Skill Selector (GSS), which helps the robot decide which of its "tools" and skills is best suited to a task. Using things like its cameras, the robot can use GSS to work out an object’s shape, size, and orientation.

With this information in hand (so to speak), the robot can then work out what needs to be done to complete a given task (e.g., pick up, push, grip, or hold with two hands).

The second is called Adaptive Recursive Gaussian Network (ARGN). Once the robot picks a skill, the ARGN helps the robot actually perform the task. It achieves this by modelling spatial relationships between the robot and the object.

It can also help predict movements step-by-step, and is extremely data-efficient (needs far fewer training examples than typical deep learning methods).
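To make the division of labor concrete, here is a toy Python sketch of the two-stage idea: a selector maps coarse geometric cues to a skill, and an executor turns the chosen skill into a step-by-step plan. All names, thresholds, and data structures below are invented for illustration and do not come from the paper:

```python
from dataclasses import dataclass

@dataclass
class ObjectGeometry:
    """Rough geometric priors a robot might extract from its cameras."""
    shape: str        # e.g. "box", "cylinder", "sheet"
    width_cm: float
    height_cm: float

def select_skill(obj: ObjectGeometry) -> str:
    """Toy stand-in for the Geometric-Prior Skill Selector (GSS):
    map coarse shape/size cues to a manipulation skill."""
    if obj.width_cm > 20:       # too wide for one gripper
        return "two_hand_hold"
    if obj.shape == "sheet":    # flat objects are easier to push
        return "push"
    return "pick_up"

def execute_skill(skill: str, obj: ObjectGeometry) -> list[str]:
    """Toy stand-in for ARGN: turn the chosen skill into a
    step-by-step motion plan for this object."""
    plan = [f"approach {obj.shape}"]
    if skill == "two_hand_hold":
        plan += ["spread arms", "close both grippers", "lift"]
    elif skill == "push":
        plan += ["lower palm", "push forward"]
    else:
        plan += ["close gripper", "lift"]
    return plan

mug = ObjectGeometry(shape="cylinder", width_cm=8, height_cm=10)
skill = select_skill(mug)
print(skill, execute_skill(skill, mug))
```

The real GSS and ARGN are learned models rather than hand-written rules, but the pipeline shape — perceive geometry, pick a skill, then generate the motion — follows the article’s description.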

This combination of ARGN and GSS helps robots complete tasks without needing thousands of demonstrations. In testing, robots using the framework achieved an impressive 87 percent success rate on novel tasks they had no prior experience completing.

The team also found that the framework is around five times more data-efficient than current state-of-the-art diffusion-policy-based models, a gain that could prove very important as the technology matures.

If robots can reliably manipulate objects without being retrained for each new situation, they can actually be used in tasks like helping around the home to clean, tidy, and perhaps even cook.


Apple Is Facing Key Leadership Shakeup

Posted by Kirhat | Tuesday, December 16, 2025 | | 0 comments »

Apple Management
Fortune reported that tech giant Apple is currently undergoing its most extensive executive overhaul in recent history, with a wave of senior leadership departures that marks the company’s most significant management realignment since visionary co-founder and CEO Steve Jobs died in 2011.

The leadership exodus spans critical divisions from artificial intelligence to design, legal affairs, environmental policy, and operations, which will have major repercussions for Apple’s direction for the foreseeable future.

On 4 December, Apple announced that Lisa Jackson, its VP of environment, policy, and social initiatives, and Kate Adams, the company’s general counsel, will both retire in 2026. Adams has been Apple’s chief legal officer since 2017, and Jackson joined Apple in 2013. Adams will step down late next year, while Jackson will leave next month.

Jackson and Adams join a growing list of top executives who have either left or announced their exits this year. AI chief John Giannandrea announced his retirement earlier this month, and its design lead Alan Dye, who took charge of Apple’s all-important user interface design after Jony Ive left the company in 2019, was just poached by Mark Zuckerberg’s Meta this week.

The scope of the turnover is unprecedented in the Tim Cook era. In July, Jeff Williams, Apple’s COO who was long thought to succeed Cook as CEO, decided to retire after 27 years with the company. One month later, Apple’s CFO Luca Maestri also decided to step back from his role. And the design division, which just lost Dye, also lost Billy Sorrentino, a senior design director, who left for Meta with Dye.

Things have been particularly turbulent for Apple’s AI team, though: Ruoming Pang, who headed its AI Foundation Models Team, left for Meta in July and took about 100 engineers with him. Ke Yang, who led AI-driven web search for Siri, and Jian Zhang, Apple’s AI robotics lead, also both left for Meta.

While all of these departures are a big deal for Apple, the timing may not be a coincidence. Both Bloomberg and the Financial Times have reported on Apple ramping up its succession plan efforts in preparation for Cook, who has led the company since 2011, to retire in 2026.

Cook turned 65 in November and has grown Apple’s market cap from about US$ 350 billion to a whopping US$ 4 trillion under his tenure. Bloomberg reports John Ternus has emerged as the leading internal candidate to replace him.


McDo AI Ad Labeled As "Cold" And "Emotionless"

Posted by Kirhat | Thursday, December 11, 2025 | | 0 comments »

McDo AI Ad
The general consensus was that it is not appealing, and the public made sure it got pulled. A recent McDonald’s Christmas advertisement, generated entirely by AI, faced public backlash, leading to the video being delisted from YouTube.

Reportedly, the ad was created for the fast-food giant’s Netherlands division by the ad agency TBWA\Neboko and the production house The Sweetshop.

The 45-second spot revolved around the theme that the holiday season is the "most terrible time of the year."

It was labeled "cold" and "emotionless" by viewers who decried its low quality and the use of AI rather than human artists.

This cynical premise presented McDonald's as a peaceful sanctuary free from seasonal chaos.

It depicts AI-generated individuals suffering through various common winter activities that go wrong, such as stressful family dinners, chaotic shopping, caroling, botched cookie baking, and disastrous Christmas tree decorating.

The commercial ends with the line: "Hide out in McDonald’s until January’s here."

Viewers criticized both the quality and the message of the advertisement.

The AI-generated McDonald’s ad was visually jarring with rapidly changing scenes that complicated the viewing experience.

Futurism reported that this technique is often used in AI video because the technology tends to lose visual continuity after only a few seconds.

The advertisement’s characteristic AI flaws created an unsettling "uncanny valley" effect, making the clip immediately become the source of viewer dissatisfaction.

The ad, posted earlier on YouTube, generated a modest 20,000 views.

It prompted a flood of negative comments, leading McDonald’s to first disable the comment section for the weekend and then completely remove the video.


Insect-Style Robot Pulled Off Difficult Maneuvers

Posted by Kirhat | Saturday, December 06, 2025 | | 0 comments »

Insect Robots
According to a report by Aamir Khollam of Interesting Engineering, tiny robotic insects may soon become lifesaving tools in disaster zones. The report states that MIT researchers have unveiled an aerial microrobot that flies with unprecedented speed and agility, mirroring the gymnastic motion of real insects.

In the future, these miniature flying machines could navigate collapsed buildings after earthquakes and help locate survivors in places larger robots cannot reach.

The breakthrough marks a significant shift in micro-robotics, where flight stability and speed have historically lagged far behind nature’s engineering.

Earlier versions of insect-scale robots could only fly slowly and along predictable paths. The new robot changes that dynamic entirely.

Roughly the size of a microcassette and lighter than a paperclip, the machine uses soft artificial muscles that power its large flapping wings at high frequency.

The updated hardware enables tight turns, rapid acceleration, and aerial tricks that resemble insect maneuverability.

But hardware alone wasn’t enough. The robot needed a smarter and faster "brain."

That came in the form of a new AI-based controller that interprets the robot’s position and environment, then decides how it should move in real time.

Previous control systems required manual tuning by engineers, which limited performance and didn’t scale for complex movement.
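The contrast with manually tuned controllers can be illustrated with a toy sense-decide-act loop. Everything below is a hypothetical sketch (the function names, state layout, and simplified dynamics are invented for illustration), not MIT's actual control framework:

```python
def learned_policy(state):
    """Hypothetical stand-in for an AI-based controller: map the robot's
    sensed state to flight commands. (MIT's real controller is a trained
    model; this rule-based version only illustrates the interface.)"""
    x, y = state["pos"]
    tx, ty = state["target"]
    # Steer toward the target; clamp commands to actuator limits.
    amp = max(-1.0, min(1.0, 0.5 * (tx - x)))
    lift = max(0.0, min(1.0, 0.5 * (ty - y) + 0.5))
    return {"amplitude": amp, "lift": lift}

def control_loop(start, target, steps=50, dt=0.1):
    """Minimal real-time loop: sense -> decide -> act, repeated each tick."""
    pos = list(start)
    for _ in range(steps):
        cmd = learned_policy({"pos": tuple(pos), "target": target})
        pos[0] += cmd["amplitude"] * dt       # toy dynamics, not physics
        pos[1] += (cmd["lift"] - 0.5) * dt
    return tuple(pos)

print(control_loop((0.0, 0.0), (1.0, 1.0)))
```

The point of the learned approach is that `learned_policy` is trained rather than hand-tuned, so the same framework can scale to maneuvers too complex for engineers to parameterize by hand.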

Kevin Chen, associate professor in MIT’s Department of Electrical Engineering and Computer Science, explains the goal clearly: "We want to be able to use these robots in scenarios that more traditional quadcopter robots would have trouble flying into, but that insects could navigate."

He adds, "Now, with our bioinspired control framework, the flight performance of our robot is comparable to insects in terms of speed, acceleration, and the pitching angle. This is quite an exciting step toward that future goal."
