
An AI that lives alongside you: Razer introduces Project Ava, its holographic companion

At CES, some innovations stand out for their technological prowess, while others do so for the cultural shift they’re setting in motion. With Project Ava, Razer clearly falls into the latter category. By offering a holographic desktop companion, the brand isn’t merely adding a feature to existing artificial intelligence. It’s redefining the relationship between humans and machines. AI is no longer activated solely on demand; it shares the space, observes, waits, and accompanies. A quiet but profound transformation of personal computing.

Over the past decade, digital assistants have accustomed users to interacting with intelligent systems. But these interactions remain sporadic, triggered by an explicit command or request. Project Ava offers something different: an embodied, visible AI, endowed with a form and a continuous presence. This evolution is part of a trend documented by research in human-machine interaction: users develop higher expectations of systems they perceive as “present” rather than purely functional [1]. AI is no longer just a service; it is becoming a part of everyday life.

Technically, Project Ava relies on a combination of sensors and AI models designed to capture context rather than just commands. An HD camera, eye tracking, long-range microphones, and screen access via “PC Vision” mode enable the AI to understand what the user is doing without the user having to verbalize everything. This multimodal analysis capability brings Project Ava closer to so-called ambient AI systems, capable of reasoning based on weak and continuous signals [2]. The challenge is no longer providing a one-off response, but rather a comprehensive interpretation of a situation.

Razer highlights the versatility of its holographic companion. With real-time tips to improve gaming performance, smart reminders, calendar management, and personalized suggestions, Project Ava sits at the intersection of entertainment and productivity. This choice reflects a broader shift in digital usage. According to an MIT study, more than 70% of users now expect digital tools to adapt to their multiple contexts rather than compartmentalizing work and leisure [3]. Ava embodies this continuity, for better or for worse.

The AI’s access to on-screen content is one of the most significant innovations in Project Ava. By directly understanding what the user is viewing or interacting with, the AI can provide assistance with unprecedented precision. But this level of proximity immediately raises trust concerns. Razer asserts that data processing occurs locally, without systematic transmission to the cloud. This promise comes at a time when the location of processing has become a key criterion for the acceptability of AI systems [4]. It remains to be seen how this architecture will hold up under prolonged and complex use.

The holographic embodiment of Project Ava is not merely an aesthetic choice. It is part of a cognitive design approach. By giving the AI a face and a form, Razer facilitates interaction while also increasing emotional engagement. Research by Reeves and Nass has shown that humans tend to attribute intentions and emotions to artificial agents as soon as they exhibit anthropomorphic characteristics [5]. This projection can enhance the effectiveness of the assistance, but it can also blur the line between tool and relationship.

Embedded AI represents an emerging yet strategic market. Research firms estimate that AI-enhanced personal assistants could be worth more than $90 billion by 2028 [6]. However, this potential comes with a high dropout rate for products perceived as intrusive or anxiety-inducing. Project Ava therefore walks a fine line between desirable innovation and the risk of social rejection.

The introduction of a holographic companion capable of continuously seeing, hearing, and analyzing raises major ethical questions. First and foremost is the issue of consented surveillance. Even if the user explicitly accepts the presence of AI, the normalization of constant algorithmic surveillance can alter behaviors and social norms [7]. Next is the issue of cognitive autonomy. An AI that is too proactive can steer decisions and routines without leaving the user the space needed for critical reflection. Finally, emotional attachment to an artificial agent raises questions about the responsibility of designers, particularly with regard to young or vulnerable audiences.

Beyond Razer, Project Ava signals a broader shift. Artificial intelligence is no longer content with simply being high-performing; it now seeks to be acceptable, livable, almost familiar. This quest for embodiment marks a new phase in AI, where success will be measured not only by precision or computational power, but by the ability to coexist sustainably with humans. Fascinating, unsettling, or both, this companion AI raises a central question: how far are we willing to let artificial intelligence live alongside us?

The emergence of embodied AI companions like Project Ava raises broader questions about how artificial intelligence is making its way into our living and working spaces, beyond the confines of our screens. To explore this discussion further regarding the everyday uses of AI, their impact on our habits, and our relationship with technology, we invite you to read our analysis on the evolution of digital practices in the era of widespread AI: We Live with AI as a New Habit: A Look Back at a Pivotal Year.

1. Reeves, B., Nass, C. (2023). The Media Equation Revisited: Human Responses to Artificial Agents. https://www.cambridge.org

2. Weiser, M. (2024). The Computer for the 21st Century, revisited. https://www.scientificamerican.com

3. MIT Media Lab. (2024). Context-aware computing and human expectations. https://www.media.mit.edu

4. Stanford HAI. (2025). Trust and adoption in AI-driven interfaces. https://hai.stanford.edu

5. Reeves, B., Nass, C. (1996, updated 2023). The Media Equation. https://www.cambridge.org

6. McKinsey Global Institute. (2025). The economic potential of personal AI assistants. https://www.mckinsey.com

7. OECD. (2024). Ethical challenges of AI companions and ambient intelligence. https://www.oecd.org
