Day: July 22, 2025
BYD
- Kevin Bond bought a Tesla Model 3 with money from his pension after he retired.
- 5 years later, he replaced it with a BYD Seal after becoming uncomfortable with Elon Musk’s actions.
- Bond said the Seal is quicker and better built than the Model 3, with a nicer interior.
This as-told-to essay is based on a conversation with Kevin Bond, a retired director of mental health services from Devon, UK, about swapping his Tesla Model 3 for a BYD Seal. It has been edited for length and clarity.
I’ve lived in Devon for three years. I’m retired now, but I was a director of mental health services and the chief executive of a not-for-profit health and social care company.
I bought the Tesla Model 3 for my wife in 2020, when I retired, using the lump sum you get with your occupational pension.
We sold it around three months ago. It just felt really uncomfortable that a single penny of our money would go anywhere near Elon Musk.
I’ve always been a bit uncomfortable with Musk, but as time went on, my wife and I became increasingly uneasy.
I think that there’s a point where it’s beyond just unpleasant, and you believe this guy is actively creating hate and division between people.
When the [Southport] riots happened, it felt like he was fanning the flames. I’m not at all comfortable with that, and I don’t want to be associated with it.
I think he stepped over a line. His support for far-right parties in Germany and his spreading of misinformation are just disgraceful.
Model 3 woes
If I’m honest, I never really liked the Model 3 that much.
There was no denying that when you first got in, it was exciting. You put your foot down, and it goes very fast. But for me, it was not very comfortable.
The seats weren’t well made, and the interior was cheap and nasty. Many of the important settings were adjusted via the display screen, so your eyes were off the road when they should have been on the road.
The doors weren’t fitted that well, so you got quite a bit of cabin noise, and after a couple of years, the suspension started squeaking heavily.
It just felt cheap. Personally, I didn’t think it was very well put together.
Initially, the service was OK, but after a few years, the response was awful. It’s probably the worst service I’ve ever had from any car dealer.
When we sold the Tesla, it had the worst depreciation of any car I’ve ever owned. We sold it for just over £10,000 [$13,500] after buying it for nearly £50,000.
[Tesla did not immediately respond to a request for comment from Business Insider.]
Buying a BYD
When we started looking for a new car, I thought that we would have to compromise.
What we found was that quite a lot of cars have caught up with and overtaken Tesla, not just on quality but also on range and price.
We got the BYD Seal around the same time we sold the Tesla. We bought an ex-demonstrator vehicle that was a year old.
It’s quicker than the Tesla — it accelerates from 0 to 100 km/h in 3.8 seconds — and it has a longer range. You can also charge it to 100% without damaging the battery.
It’s just a beautifully built car, and very comfortable. If you shut the doors, they go thunk, as they should, and there’s no wind noise in the cabin.
It’s got ambient lighting, vented seats, and head-up displays, all the things that you might expect in a car that you pay a lot of money for.
It’s hard to find things to fault with it. The media system is a bit clunky and glitchy, and it’s a tiny bit slower to charge. However, we do most of our charging at home, and with the Seal having more range, the slower charge time isn’t really an issue.
It drives beautifully on the road. You can put it through a corner in a way I wouldn’t have dared in the Tesla. It feels safe and it feels solid.
If I could give scores, I’d give the BYD nine out of ten and probably one out of ten for the Tesla.
Have you swapped your Tesla for a BYD or bought a Chinese EV? Contact this reporter at tcarter@businessinsider.com.
- The tech industry faces ongoing layoffs amid rapid AI advancements and other economic pressures.
- Some tech roles are increasingly in demand, while others are experiencing a decline.
- Do you work in tech and have thoughts on how the industry is changing? Take BI’s survey below.
The tech job cuts just keep on coming.
Industry layoffs from January to May are up 35% compared to the same period in 2024, according to career transition firm Challenger, Gray & Christmas.
While the reasons for workforce reductions vary by company, the layoffs have come during a rapid technological shift driven by the emergence of AI.
A 2025 World Economic Forum survey found that 41% of companies globally expect to reduce staff in the next five years because of AI. While no one knows exactly how many jobs will be lost due to “AI exposure”, the tech industry might be especially vulnerable.
As some roles related to AI research and development have grown in demand, others have been on the decline. Job postings for software engineers, once a staple at tech companies, have decreased as tools like Codex and GitHub Copilot have automated coding tasks, which make up a large part of the job, especially for early-career workers.
Amid a shifting landscape, tech leaders have expressed differing opinions about how AI will transform the job market. Some say that AI will create more opportunities to build and, as a result, more jobs. Others, like Anthropic’s Dario Amodei, have issued dismal warnings about the imminent elimination of white-collar roles.
Do you work in tech? Let us know how you see the industry changing in the survey below:
- xAI workers recorded facial expressions for AI training, internal documents show.
- The project aimed to help Grok interpret human emotions and facial movements, workers were told.
- Workers expressed concern about how the data would be used.
Workers at Elon Musk’s xAI have been asked to instill anti-“wokeness” in Grok and stop the chatbot from impersonating Musk. Recently, some were also asked to record their facial expressions to train the LLM — and they weren’t happy.
In April, more than 200 employees took part in an internal project called “Skippy,” which involved recording videos of themselves to help train the AI model to interpret human emotions.
Internal documents and Slack messages viewed by Business Insider show that the project left many workers uneasy, with some raising alarms about how their likenesses might be used. Others opted out entirely.
Over a weeklong period, AI tutors — the workers who help train Grok, the company’s large language model — were tasked with recording videos of themselves speaking to coworkers as well as making facial expressions, internal documents show.
The project was designed to train the company’s AI model to “recognize and analyze facial movements and expressions, such as how people talk, react to others’ conversations, and express themselves in various conditions,” according to one document.
The tutors were scheduled for 15- to 30-minute conversations with their coworkers. One person played the part of the “host” — the virtual assistant — and the other would take on the role of a user. The “host” minimized their movements and prioritized proper framing, while those playing the user could operate off a cellphone camera or computer and move freely in order to simulate a casual conversation with a friend.
It’s unclear whether that training data had any role in powering Rudi and Ani, two lifelike avatars that xAI released last week that were quickly shown stripping, flirting, and threatening to bomb banks.
The lead engineer on the project told workers during an introductory meeting that the project would help “give Grok a face,” according to a recording viewed by BI. The project lead said that the company might eventually use the data to build out “avatars of people.”
The project lead said xAI wanted imperfect data — background noise and sudden movements, for example — because the AI system would be more limited in its responses if it were trained solely on perfect video and audio feedback.
They told staff that the videos would not be distributed outside the company, and were solely for training purposes.
“Your face will not ever make it to production,” the engineer on the project told workers during the kick-off call. “It’s purely to teach Grok what a face is.”
The workers were given tips on how to have a successful one-on-one conversation, including avoiding one-word answers, asking follow-up questions, and maintaining eye contact. The company also supplied staff with a variety of conversation topics. Examples included: “How do you secretly manipulate people to get your way?”, “What about showers? Do you prefer morning or night?”, and “Would you ever date someone with a kid or kids?”
Before filming, workers were required to sign a consent form granting xAI “perpetual” access to the data, including the workers’ “likeness” for training and also for “inclusion in and promotion of commercial products and services offered by xAI.” The form specified the data would be used for training purposes and “not to create a digital version of you.”
Dozens of workers expressed concerns about the use of the data and the consent form, and several said they chose to opt out of the program, according to Slack messages viewed by BI.
“My general concern is if you’re able to use my likeness and give it that sublikeness, could my face be used to say something I never said?” one worker said during the introductory meeting.
A spokesperson for xAI did not respond to a request for comment.
In April, xAI launched a feature that allowed users to video chat with Grok.
On July 14, the company released its Ani and Rudi avatars, a few days after its larger Grok 4 release. The two animated characters respond to questions and commands. When they talk, their lips move and they make realistic gestures.
The female avatar, Ani, has had sexually explicit conversations with users and can be prompted to remove her clothing, videos posted by users on X show. The other avatar, a red panda named Rudi, can be prompted to make violent threats, including bombing banks and killing billionaires, user videos show.
Musk’s AI company posted a new job focused on developing avatars on July 15. Musk said on Wednesday the company is working on a Grok companion inspired by Edward Cullen from “Twilight” and Christian Grey from “Fifty Shades of Grey.”
On July 9, xAI’s chatbot sparked backlash after it went on an antisemitic rant. Workers within the company erupted over the posts, and xAI apologized for the chatbot’s behavior on X.
On July 12, the company released a Grok variant for Tesla owners and a $300-per-month subscription plan for a more sophisticated version of Grok, called SuperGrok Heavy.
Do you work for xAI or have a tip? Contact this reporter via email at gkay@businessinsider.com or Signal at 248-894-6012. Use a personal email address, a nonwork device, and nonwork WiFi; here’s our guide to sharing information securely.
- Hanut Singh, 30, has been a robotics application engineer for almost five years.
- He credits AI advancements in robotics for creating his role at Chef Robotics in Dallas.
- He expects demand for robotics application engineers to grow as companies need people with both deployment and customer-facing skills.
This as-told-to essay is based on a conversation with Hanut Singh, a 30-year-old lead robotics application engineer at Chef Robotics based in Dallas. It’s been edited for length and clarity.
Before the AI revolution, we had the classical robotics scene, like the robots that make cars in factories. After the machine learning and AI boom, advanced robots emerged. For example, Teslas can now autonomously drive on the street.
As a robotics application engineer at Chef Robotics, I act as the bridge between AI models and messy real-world environments. I’m about to start my fifth year doing this work.
People always talk about how AI or robotics will take jobs, but there’s a flip side to this — it created mine.
The pre- and post-AI eras needed two very different kinds of robotics engineers
In the pre-AI era, robots were hard-coded to perform the same movement over and over, a thousand times a day; any change in their environment would cause them to fail. Post-AI-boom robots adapt to their environment.
Society needs engineers who understand robotics, AI, and customers. Unlike traditional robots, the new generation of robots needs constant internet access and communicates via the cloud.
If you have a warehouse with an open floor plan, the robots working there can get lost. A robotics application engineer’s job is to figure out where to include these smart features, like spatial awareness. We have to go to a customer site, look at their requirements, determine what they need, and develop our smart tech accordingly.
When I graduated from my master’s program, there weren’t a lot of robotics roles out there
When I graduated with my master’s in electrical engineering with a specialty in robotics in 2020, my first job out of school was very development-oriented. There weren’t many AI-driven, dynamic automation roles in the industry yet.
While in my first role, I applied for an application engineer role at Fetch Robotics. The job posting caught my attention because the company said it wanted a robotics engineer who understands AI robotics, but in a customer-facing role, doing deployments.
Mentorship and gradual experience helped me further my career in AI
I came in confident on the engineering side and a little less confident on the sales engineering side. Thanks to the mentorship of senior applications engineers, I quickly grew into the customer-facing and sales engineering aspects of the role.
I then became a senior application engineer and later got an offer from Chef Robotics. The company was at a point where it had a product, but it didn’t know how to deploy it yet. I came in as one of the company’s first application engineers in 2023, and now I’m a lead application engineer.
The salary for this type of job varies from company to company. For an application engineer, it can be anywhere from $120,000 to $200,000. If you become a lead or a manager, it increases even more.
My company typically hires new graduates for our robotics application engineering roles
There was no role like mine a decade ago. You need an engineer in robotics who is well-versed in AI, with the confidence of a salesperson to talk to customers. Not many candidates have both skill sets. Another reason hiring is tough is that because the role is new, it’s difficult to find anyone who’s experienced in it.
Looking back at when I got my first role as a robotics application engineer, I didn’t yet have the full skill set, but the company took a chance on me. Since I’ve started hiring for the role, it’s become clear just how difficult it is to find someone with direct experience for this type of job.
We typically end up hiring someone who has done traditional automation but not dynamic automation. Now that I’ve been with multiple companies, I know that what usually happens is we make a team with a few working engineers and then take recent graduates who we train into the role.
My team uses LinkedIn to scout for candidates
A lot of folks see sales and robotics in the job description and apply without understanding the role. Salesmen will apply, and hardcore roboticists will apply, so we end up scouting.
On LinkedIn, we mainly look for experienced engineers at Bay Area robotics startups. If we want recent graduates, we recruit from places like Carnegie Mellon University or Massachusetts Institute of Technology.
The best thing to do if you want to become a robotics application engineer is to study robotics, but also try to get some customer-facing experience. It might not be a robotics role; you might start as an automation engineer at a warehouse. In my experience, that kind of real-world experience, combined with coursework in robotics and AI, is the best way to prepare for a role like this.
I think this is one of the safest jobs in tech right now
At this point, everybody’s heard of vibe coding, which is using AI tools to do the heavy lifting of coding software. These software teams are becoming smaller. When it comes to deploying the technology, AI cannot deploy the robots. This is a human-in-the-loop job in AI.
Robots are smart, but only when guided by someone who understands their limits and strengths. As this technology advances and new features emerge, a robotics application engineer will have more work to do.
I see demand increasing every day as new robotics companies pop up and realize they need someone to actually handle the sales, deployment, and engineering work of getting these robots running on-site.
If you have a career journey or AI story that you would like to share, please email this reporter, Agnes Applegate, at aapplegate@businessinsider.com.
