Artificial Intelligence

ChatGPT Now Lets You Name Someone to Check In If Things Get Dark

AI chatbots have transformed how we discuss personal topics, including some of life’s most difficult moments. But this openness comes with responsibility. OpenAI is stepping up with a new feature called ChatGPT Trusted Contact, designed to bring a human into the loop when conversations take a serious turn.

Rolling out now for adult users, this optional setting lets you designate one person who can be alerted if the AI detects potential self-harm concerns. It’s a proactive move that blends technology with human oversight.

How Does ChatGPT Trusted Contact Work?

Setting up a ChatGPT Trusted Contact is straightforward but comes with clear rules. The person you choose must be at least 18 years old—or 19 in South Korea. Once you nominate someone, they receive an invitation explaining their role. They have one week to accept before the feature activates. If they decline, you can pick another contact.

The alert process isn’t automatic. When ChatGPT’s systems flag a conversation as concerning, the chatbot first informs you that your contact may be notified. It also suggests conversation starters to help you reach out directly. A small team of specially trained human reviewers then evaluates the situation. Only if they confirm a serious risk does your contact get notified—via email, text, or in-app alert.
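The escalation flow described above can be sketched as a simple gate: automated flag, heads-up to the user, human review, and only then an alert. This is a hypothetical illustration (OpenAI has not published an API for this feature; the class, function, and threshold names are all invented):

```python
from dataclasses import dataclass

FLAG_THRESHOLD = 0.8  # hypothetical classifier cutoff

@dataclass
class Conversation:
    risk_score: float         # output of an automated self-harm classifier
    user_notified: bool = False

def escalate(convo: Conversation, reviewer_confirms_risk: bool) -> str:
    """Sketch of the flow: automated flag -> user heads-up -> human
    review -> contact alert. No transcript is ever shared."""
    if convo.risk_score < FLAG_THRESHOLD:
        return "no_action"
    convo.user_notified = True        # user is told their contact may be alerted
    if reviewer_confirms_risk:        # trained human reviewer confirms serious risk
        return "notify_contact"       # email, text, or in-app; no chat details
    return "reviewed_no_alert"
```

Note that the human reviewer, not the classifier, makes the final call: a high automated score alone never notifies the contact.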

What Information Is Shared?

Importantly, the alert doesn’t share chat transcripts or details. It simply states that self-harm came up in a potentially concerning way and asks the contact to check in. OpenAI aims to complete this human review within one hour, ensuring timely support without compromising privacy.

Why Is OpenAI Adding This Now?

This feature builds on earlier safety measures. Previously, OpenAI introduced alerts for parents when linked teen accounts show distress. ChatGPT Trusted Contact extends that protection to adults. It was developed with input from clinicians, researchers, and mental health organizations, including the American Psychological Association.

However, this feature isn’t a replacement for professional help. ChatGPT will still direct users to crisis hotlines and emergency services when needed. You can remove or change your trusted contact anytime, and contacts can opt out whenever they wish.

As AI becomes a confidant for many, ChatGPT Trusted Contact acknowledges that technology has limits. It’s a step toward blending digital support with real human connection. For more on AI safety, check out our guide on AI safety tips for users.

What This Means for Users

The reality is that people use ChatGPT for deeply personal conversations, whether OpenAI planned for it or not. Adding a feature like this is a move in the right direction. It also admits that a chatbot can only do so much.

If you’re considering setting up a trusted contact, remember it’s optional but potentially life-saving. For more on mental health resources, visit our mental health support page.

In summary, ChatGPT Trusted Contact represents a thoughtful evolution in AI safety. It combines automated detection with human judgment, offering a safety net without overstepping privacy boundaries.


Perplexity’s Personal Computer Can Work Autonomously on Your Mac, and It’s Now Available to All

Perplexity Personal Computer Mac App Goes Wide: Meet the Autonomous AI Agent for Your Desktop

Perplexity has officially launched its Personal Computer Mac app for all users, and this release marks a significant step forward in how artificial intelligence can interact with your local machine. Instead of just answering questions in a chat window, this AI agent works directly on your Mac — searching files, operating inside native apps, browsing the web, and connecting to over 400 tools — all without constant human supervision.

You give it a task, and it gets to work. You only step in when it needs your approval. It’s similar in spirit to Claude Cowork, but Perplexity’s approach is purpose-built for deep integration with macOS.

What Can Perplexity Personal Computer Do for You?

If you have ever lost an hour jumping between apps, hunting for a file, or manually pulling information from different sources, this tool is built for exactly that frustration. It can work across your local files, native Mac apps, connected tools, and the web — all in one go.

Here is a fun example Perplexity showcased: open Notes, press both CMD keys, and ask it to handle your to-do list. It figures out each task and gets them done. You can also ask it to clean up a messy folder and sort everything into properly named project folders.

Building on this, the best part is that you stay in control. It checks in when a decision matters, and every action it takes is reversible. Think of it as a capable assistant — not an unsupervised one.
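That check-in-when-it-matters behavior amounts to an approval gate in front of the agent's action queue. A minimal sketch of the idea, assuming a simple dict-based action format (the structure and function names are invented, not Perplexity's actual implementation):

```python
def run_task(actions, approve):
    """Hypothetical approval loop: actions flagged as destructive pause
    for user sign-off; every executed action is logged so it can be
    undone later, matching the 'every action is reversible' promise."""
    executed = []
    for action in actions:
        if action.get("destructive") and not approve(action):
            continue                 # user rejected this step; skip it
        executed.append(action)      # keep a log for reversibility
    return executed

# Usage: auto-approve everything for an unattended overnight run
plan = [{"name": "read_folder", "destructive": False},
        {"name": "delete_duplicates", "destructive": True}]
done = run_task(plan, approve=lambda a: True)
```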

What’s the Best Way to Run Perplexity Personal Computer?

According to Perplexity, a desktop Mac is the ideal environment. Since the entry cost of a Mac mini is relatively low, it’s the recommended option. Running the AI agent on a Mac mini allows it to operate continuously, even when you are away from your desk.

You can kick off a task from your iPhone, and the agent keeps working on your Mac back home. You can also approve requests from any device and let it run overnight for complex projects.

However, the Mac mini is currently in short supply, so you may have to wait or settle for whichever model is in stock. The new Perplexity Personal Computer Mac app is available today for all users. Everyone gets everyday features like search, attachments, and dictation.

It is not yet on the App Store, so you will need to download it directly from Perplexity’s website. Learn more tips for using Perplexity AI effectively to get the most out of this tool.

How Autonomous AI Agents Are Changing Productivity

This launch reflects a broader shift in the AI landscape. Instead of passive chatbots, we now have autonomous AI agents that can act on your behalf. Perplexity’s approach is particularly notable because it works locally on your Mac, meaning your data stays on your device rather than being processed entirely in the cloud.

For power users, this opens up possibilities like automated research, file organization, and cross-app workflows that previously required complex scripting or third-party automation tools.

Practical Use Cases for Perplexity Personal Computer

  • Research and summarization: Ask it to gather information from multiple sources and compile a report.
  • File management: Request a cleanup of your Downloads folder into organized project directories.
  • Cross-app workflows: Have it pull data from a spreadsheet, create a chart in Keynote, and email the result.

Getting Started with Perplexity Personal Computer

To begin, head to Perplexity’s website and download the Mac app. Once installed, you can start giving it tasks immediately. The interface is straightforward: type or speak your request, and the agent takes over.

Remember that you can approve or reject actions at any time. This safety mechanism ensures you remain the decision-maker for critical operations. Explore other Mac productivity apps that complement this tool.

Final Thoughts on Perplexity Personal Computer

Perplexity has delivered a genuinely useful AI agent that feels like a natural extension of your Mac. It is not a gimmick — it solves real problems around multitasking and information management. As the technology matures, we can expect even deeper integration with macOS and third-party services.

Download it today and see how much time you can save by letting an AI agent handle the busywork.

I Built a Mac App to Track My Bad Posture with AirPods — Without Writing a Single Line of Code

Imagine wanting a custom app for a nagging problem, but you have zero coding experience. That was my reality a few weeks ago. I was tired of slouching at my desk, and existing solutions felt invasive or clunky. So, I decided to build a Mac app with AI that uses my AirPods’ motion sensors to detect bad posture. The best part? I never wrote a single line of code. I just talked to an AI chatbot, and it did all the heavy lifting.

This journey started with a simple idea: use the motion sensors inside AirPods to monitor posture changes, without relying on a webcam. I wanted something private, efficient, and personal. After experimenting with Claude from Anthropic, I realized that the barrier to creating functional software has crumbled. Now, anyone can build a Mac app with AI by describing their needs in plain English.

Why Move Away from Camera-Based Posture Tracking?

Earlier, I tested an open-source app that used my Mac’s webcam to detect slouching. It worked, but it raised serious privacy concerns. Every time the camera activated, I wondered: Is someone watching? Is my data being uploaded to a server? The app processed everything locally, but the unease remained. Many users shared similar fears on Reddit, questioning data storage and potential backdoors.

This pushed me to find an alternative. Instead of using a camera, why not tap into the motion sensors already in my AirPods Pro? These sensors track head movement and orientation. If I could calibrate good and bad postures, the AirPods could alert me when I slouch. The challenge was building the software — but I had no coding skills. That’s when I turned to Claude AI.

How I Built a Mac App with AI in Under an Hour

I opened Claude and typed: “I want to build a Mac app that uses AirPods motion sensors to detect bad posture and send notifications.” The AI asked a few clarifying questions — like whether I wanted a menu bar utility or a full-window app. I replied with simple yes/no answers. Within 30 minutes, Claude generated the entire codebase, including a menu bar icon, notification banners, calibration controls, and a two-stage warning system.

Claude even designed the app icon and saved everything neatly in a folder. I didn’t see a single line of Swift or Xcode. The AI handled all the technical details, from motion data parsing to animation logic. When I ran the compiled app, it worked flawlessly on the first try. No errors, no crashes. This experience showed me that no-code app development is not just a buzzword — it’s a practical reality.

The Calibration Process: Simple and Intuitive

Launching the app, it asked me to sit upright for a few seconds to record my “good posture.” Then, I slouched forward to capture the “bad posture.” The app used the AirPods’ gyroscope and accelerometer data to distinguish between the two. No manual inputs needed. Once calibrated, the app runs silently in the menu bar. When I sit straight, the icon stays grey. If I start slouching, it turns yellow, then red. After 12 seconds of poor posture, a notification pops up with a warning chime.
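The grey/yellow/red logic above boils down to comparing the current head pitch against the two calibrated poses. Here is a minimal sketch in Python; the thresholds are hypothetical (the real app's Swift logic was generated by Claude and I never read it):

```python
def classify_posture(pitch, good_pitch, bad_pitch):
    """Map the current head pitch (degrees, from the AirPods' motion
    sensors) onto the grey/yellow/red scale, relative to the two
    calibrated poses. Threshold values are invented for illustration."""
    frac = (pitch - good_pitch) / (bad_pitch - good_pitch)  # 0 = upright, 1 = full slouch
    if frac < 0.3:
        return "grey"      # sitting straight
    if frac < 0.7:
        return "yellow"    # starting to slouch
    return "red"           # ~12 s of sustained red would trigger the notification
```

In the real app, a timer layered on top of this classifier fires the warning chime once the "red" state has persisted for 12 seconds.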

I tested the app with friends using second-gen AirPods Pro. They were surprised by the accuracy. The motion sensing was responsive, and the alerts felt helpful, not annoying. This confirmed that AirPods posture tracking is a viable alternative to camera-based systems.

Privacy First: On-Device Processing Keeps Data Safe

Privacy was my primary motivation. Many health apps upload data to cloud servers, exposing sensitive information to third parties. My app processes everything locally on the Mac. No data ever leaves the device. The AirPods sensors communicate via Bluetooth, and all analysis happens on-device. This approach eliminates the risk of data leaks or unauthorized access.

For anyone concerned about on-device health privacy, this is a game-changer. You don’t need to trust a developer’s privacy policy. You control the software entirely. If you want, you can even keep the app to yourself — never publishing it to an app store. This is the ultimate form of data sovereignty.

The Limitations of No-Code App Development

While the experience was empowering, I must be realistic. Building a personal utility is one thing; launching a commercial app is another. To publish on the App Store, you need a developer account, and you have to navigate Apple’s review process and handle ongoing updates. For now, I have no plans to release this app publicly. The goal was to prove that building a Mac app with AI is possible for non-coders.

Tools like Claude excel at generating functional prototypes, but they have limits. Complex integrations (e.g., connecting to external APIs or payment systems) still require technical knowledge. However, for personal projects or internal tools, the barrier has never been lower. As AI coding assistants improve, the gap between idea and execution will shrink further.

What This Means for the Future of Software Creation

This experiment changed my perspective. I no longer feel helpless when a desired app doesn’t exist. Instead of waiting for a developer, I can prompt an AI to build it. The era of no-code app development is here, and it’s accessible to anyone with a clear idea and a willingness to experiment. Whether you want a posture tracker, a habit reminder, or a custom dashboard, the tools are ready.

For more insights on leveraging AI for productivity, check out our guide on AI productivity tools for Mac users. If you’re curious about other no-code solutions, read our comparison of best no-code platforms for beginners.

In the end, I built a Mac app with AI that solves a real problem — without writing a single line of code. If I can do it, so can you. The only limit is your imagination.

Snapchat and Perplexity AI Part Ways: Inside the $400 Million Deal That Fell Apart

The Snapchat Perplexity AI deal is officially dead. Snap confirmed in its Q1 2026 investor letter that both companies “amicably ended the relationship in Q1,” terminating a $400 million cash-and-equity agreement announced in November. The partnership would have brought Perplexity’s AI answer engine directly into Snapchat’s Chat interface, allowing users to ask questions and receive conversational, source-backed answers without leaving the app. However, the integration never materialized, leaving many wondering why.

Why Did Snapchat and Perplexity End Their Partnership?

Initial signs of trouble emerged in February, when Snap announced that it had not yet mutually agreed on a broader rollout plan with Perplexity. The deal also raised concerns about how AI search would function inside private messaging, particularly for younger users and sensitive topics. With hundreds of millions of daily users on Snapchat, integrating a chatbot seemed promising on paper, but practical hurdles proved insurmountable.

Furthermore, Snap’s latest sales guidance now assumes no contribution from Perplexity, a stark contrast to earlier projections that the partnership would begin generating revenue in 2026. The abrupt cancellation highlights the challenges tech companies face when embedding AI assistants into existing social platforms. Learn more about the rise of AI in messaging apps.

What Does This Mean for Snapchat Users?

The end of the Snapchat Perplexity AI deal does not mean Snapchat is abandoning AI altogether. The company recently introduced AI Sponsored Snaps, an ad format that lets brands place interactive AI chatbots inside Chat. So chatbot-style conversations may still appear in Snapchat, but through advertisements rather than a dedicated search engine.

Snap is also expanding features around Snap Map. Its new Place Loyalty feature ranks users based on how often they visit certain locations over the past year, awarding Gold, Silver, and Bronze status levels. Snap emphasizes that rankings remain private to the user, and location sharing is off by default. These moves suggest Snap is focusing on organic engagement rather than third-party AI integrations.

How Does Snapchat’s AI Strategy Compare to Competitors?

Tech giants are racing to embed AI assistants across their ecosystems. Meta has added Meta AI to WhatsApp, Instagram, Facebook, and Messenger. Google has integrated Gemini into Search, Android, Gmail, and other products. Snapchat seemed to be following a similar path with Perplexity, but the cancellation signals a different approach.

Instead of a universal AI chatbot, Snap is opting for controlled, ad-driven AI interactions. This strategy may appeal to advertisers but limits the scope of AI functionality for users. As a result, Snapchat’s AI capabilities remain narrower than those of competitors like Meta and Google. Compare Snapchat’s AI features with Meta’s offerings.

Is Snapchat Still Growing Without the Perplexity Deal?

Absolutely. In Q1, Snap’s global daily active users rose 5% year-over-year to 483 million, while monthly active users increased 5% to 965 million. The company credited growth to features across Snap Map, Lenses, and other parts of the app. This indicates that Snapchat can thrive without the Perplexity integration, relying on its core strengths in visual communication and augmented reality.

Building on this momentum, Snap is likely to continue investing in native AI tools rather than external partnerships. For now, the Snapchat Perplexity AI deal serves as a cautionary tale about the complexities of integrating AI into social platforms. Explore what’s next for Snapchat and AI.
