导读
前自动驾驶行业高管亲身经历特斯拉FSD突然失控撞墙,作者与孩子虽无重伤,却由此揭示了“近乎完美”的自动驾驶与AI的共同陷阱:系统长期可靠会麻痹人类注意力,一旦失灵,责任却全由用户承担。文章对比自动驾驶与生成式AI,指出二者都把人推入“道德缓冲区”,呼吁企业分担风险、明确责任,而非让用户成为试验品。
My Tesla Was Driving Itself Perfectly—Until It Crashed
The danger of almost-perfect tech
我的特斯拉自动驾驶一直很完美——直到它撞了车
近乎完美的技术所隐藏的危险
生词 & 音标
- perfectly /ˈpɜːfɪktli/ adv. 完美地
- crash /kræʃ/ v./n. 碰撞,撞车
- almost /ˈɔːlməʊst/ adv. 几乎
短语
- drive itself 自动驾驶
- almost-perfect tech 近乎完美的技术
A black steering wheel with human hands surrounded and trapped by sinuous blue artificial arms and hands
一只黑色方向盘,人类的手被弯曲缠绕的蓝色机械手臂与手掌包围、困住
生词 & 音标
- steering wheel /ˈstɪərɪŋ wiːl/ n. 方向盘
- surround /səˈraʊnd/ v. 包围
- trap /træp/ v. 困住
- sinuous /ˈsɪnjuəs/ adj. 蜿蜒的,弯曲的
- artificial /ˌɑːtɪˈfɪʃl/ adj. 人工的,机械的
短语
- artificial arms 机械臂
3
The smell was strange. Sharp. Chemical. Wrong. The concrete wall was too close. My glasses were gone. One of my kids was standing on the sidewalk next to our car—not crying, just confused.
The seat belt had held. The crumple zone had crumpled. The airbag had fired. Everything designed to protect bodies had done its job. But the car, a Tesla Model X, was totaled.
气味很奇怪,刺鼻、带着化学味,非常不对劲。混凝土墙离得太近了。我的眼镜不见了。我的一个孩子站在车旁的人行道上——没有哭,只是一脸茫然。
安全带拉住了我,防撞溃缩区正常溃缩,安全气囊也弹开了。所有用来保护人体的设计都发挥了作用。但这辆特斯拉Model X彻底报废了。
生词 & 音标
- sharp /ʃɑːp/ adj. 刺鼻的
- chemical /ˈkemɪkl/ adj. 化学的
- concrete /ˈkɒŋkriːt/ adj. 混凝土的
- sidewalk /ˈsaɪdwɔːk/ n. 人行道
- confused /kənˈfjuːzd/ adj. 困惑的
- seat belt /ˈsiːt belt/ n. 安全带
- crumple /ˈkrʌmpl/ v. 挤压,溃缩
- zone /zəʊn/ n. 区域
- airbag /ˈeəbæɡ/ n. 安全气囊
- fire /ˈfaɪə(r)/ v. 触发,弹出
- total /ˈtəʊtl/ v. (车辆)彻底报废
短语
- concrete wall 混凝土墙
- seat belt held 安全带起作用
- crumple zone 溃缩区(汽车安全结构)
- airbag fired 安全气囊弹出
- be totaled 车辆报废
背景
- crumple zone:汽车溃缩吸能区,通过变形吸收撞击力,保护乘员。
4
One Sunday last fall, my kids and I were on a drive we’d done hundreds of times, winding through the residential streets of the Bay Area to drop my son off at his Boy Scouts meeting. The Tesla was in Full Self-Driving mode, driving perfectly—until it wasn’t.
去年秋天的一个周日,我和孩子们行驶在一条开过几百次的路上,蜿蜒穿过湾区的居民区街道,送儿子去参加童子军活动。当时特斯拉正处于完全自动驾驶(FSD)模式,开得非常完美——直到它突然不完美了。
生词 & 音标
- wind /waɪnd/ v. 蜿蜒前行
- residential /ˌrezɪˈdenʃl/ adj. 住宅的
- scout /skaʊt/ n. 童子军
短语
- drop…off 送……到某处
- residential streets 居民区街道
- the Bay Area (美国)旧金山湾区
- Boy Scouts 童子军
- Full Self-Driving mode 完全自动驾驶模式(FSD)
5
What happened next, I’ve had to piece together. My memory is hazy, and some of it comes from one of my sons, who watched the whole thing unfold from the back seat. The car was making a turn. Something felt off—the steering wheel jerked one way, then the other, and the car decelerated in a way I didn’t expect. I turned the wheel to take over. I don’t know exactly what the system was doing, or why. I only know that somewhere in those seconds, we ended up colliding with a wall.
接下来发生的事,我只能拼凑起来。我的记忆很模糊,一部分来自坐在后座目睹全程的儿子。车子当时正在转弯,感觉有点不对劲——方向盘猛地往一边甩,又往另一边甩,车子以一种我完全没预料到的方式减速。我赶紧打方向盘接管车辆。我不清楚系统当时到底在做什么、为什么会这样。我只知道,在那几秒钟里,我们最终撞上了墙。
生词 & 音标
- hazy /ˈheɪzi/ adj. 模糊的
- unfold /ʌnˈfəʊld/ v. 展开,发生
- jerk /dʒɜːk/ v. 猛拉,急转
- decelerate /ˌdiːˈseləreɪt/ v. 减速
- collide /kəˈlaɪd/ v. 碰撞
短语
- piece together 拼凑
- take over 接管
- make a turn 转弯
- feel off 感觉不对劲
- steering wheel jerked 方向盘猛抖
- end up doing 最终……
- collide with a wall 撞墙
6
You might think I’d have known what to do in this situation. I used to run the self-driving-car division at Uber, trying to build a future in which technology protects us from accidents. I had thought about edge cases, failure modes, the brittleness hiding behind smooth performance. My team trained human drivers on when and how to intervene if a self-driving car made a mistake. In the two years I ran the division, we had no injuries in our early pilot programs.
你可能以为我在这种情况下知道该怎么做。我曾经负责优步(Uber)的自动驾驶部门,致力于打造一个由技术保护我们远离事故的未来。我研究过极端场景、失效模式,以及流畅表现背后隐藏的脆弱性。我的团队专门培训人类驾驶员,在自动驾驶车辆出错时何时介入、如何介入。在我管理该部门的两年里,我们的早期试点项目从未出现过人员受伤。
生词 & 音标
- division /dɪˈvɪʒn/ n. 部门
- edge case /edʒ keɪs/ n. 极端案例,边界场景
- failure mode /ˈfeɪljə məʊd/ n. 失效模式
- brittleness /ˈbrɪtlnəs/ n. 脆弱性
- smooth /smuːð/ adj. 流畅的
- intervene /ˌɪntəˈviːn/ v. 干预,介入
- pilot /ˈpaɪlət/ adj. 试点的
短语
- self-driving-car division 自动驾驶部门
- protect…from… 保护……免受……
- smooth performance 流畅的表现
- pilot programs 试点项目
7
With my own Tesla, I started out using Full Self-Driving as the default setting only on highways. That’s where it makes sense: You have clear lane markers and predictable traffic patterns. Then, one day, I tried it on a local road, and it worked well enough to become a habit.
我自己的特斯拉,一开始我只在高速公路上默认使用完全自动驾驶。那才是它真正合理的场景:车道线清晰,交通模式可预测。后来有一天,我在本地道路上试了一次,效果还不错,于是慢慢成了习惯。
生词 & 音标
- default /dɪˈfɔːlt/ adj. 默认的
- highway /ˈhaɪweɪ/ n. 高速公路
- lane /leɪn/ n. 车道
- marker /ˈmɑːkə(r)/ n. 标识
- predictable /prɪˈdɪktəbl/ adj. 可预测的
- pattern /ˈpætn/ n. 模式
短语
- default setting 默认设置
- lane markers 车道线
- traffic patterns 交通模式
- local road 本地道路,城区道路
8
Despite the accident, we were lucky. I walked away with a stiff neck, a concussion, a few days of headaches, and some memories I can’t shake. The kids climbed out unharmed. Still, you could say I was crushed in what the researcher Madeleine Clare Elish calls the moral crumple zone. Some parts of a car are specifically designed to absorb damage in a crash, to protect the people inside. But when complex automated systems fail, Elish argues, it’s the human users who take the blame. My car’s Full Self-Driving mode logged flawless miles for three years, but when the accident happened, it was my name on the insurance report.
尽管出了事故,我们还算幸运。我只是脖子僵硬、脑震荡,头疼了几天,还有一些挥之不去的记忆。孩子们毫发无损地爬了出来。即便如此,可以说我被压在了研究者玛德琳·克莱尔·伊莱什所说的“道德缓冲区”里。汽车的某些部件专门设计用来在碰撞中吸收冲击力、保护车内人员。但伊莱什认为,当复杂的自动化系统失灵时,承担责任的却是人类用户。我的车的完全自动驾驶模式三年来记录的都是零失误里程,可事故发生时,保险报告上写的却是我的名字。
生词 & 音标
- stiff /stɪf/ adj. 僵硬的
- concussion /kənˈkʌʃn/ n. 脑震荡
- headache /ˈhedeɪk/ n. 头痛
- unharmed /ʌnˈhɑːmd/ adj. 未受伤的
- crush /krʌʃ/ v. 压垮,使无助
- moral /ˈmɒrəl/ adj. 道德的
- absorb /əbˈzɔːb/ v. 吸收
- automated /ˈɔːtəmeɪtɪd/ adj. 自动化的
- blame /bleɪm/ n. 责任,责备
- log /lɒɡ/ v. 记录
- flawless /ˈflɔːləs/ adj. 完美无瑕的
- insurance /ɪnˈʃʊərəns/ n. 保险
短语
- stiff neck 脖子僵硬
- can’t shake memories 挥之不去的记忆
- moral crumple zone 道德缓冲区
- absorb damage 吸收撞击伤害
- automated systems 自动化系统
- take the blame 承担责任
- flawless miles 零故障行驶里程
- insurance report 保险报告
难点
- moral crumple zone:道德缓冲区,类比汽车溃缩区,指系统出事时,人类被设计成“承担伤害与责任”的缓冲角色。
9
And the car has evidence. While you’re at the wheel, it logs your hand position, your reaction time, whether you’re keeping your eyes on the road—thousands of data points, processed by the vehicle. After crashes, Tesla has used these data to shift blame onto drivers. Following a fatal collision in Mountain View, California, in 2018, the company released a statement in which it noted that “the vehicle logs show that no action was taken.” (Tesla did not respond to a request for comment.)
车辆还掌握着证据。你开车时,它会记录你的手部位置、反应时间、是否注视路面——成千上万的数据点,由车辆进行处理。撞车事故发生后,特斯拉曾利用这些数据把责任推给驾驶员。2018年加州山景城发生一起致命碰撞后,公司发布声明,指出“车辆日志显示未采取任何行动”。(特斯拉未回应本文置评请求。)
生词 & 音标
- evidence /ˈevɪdəns/ n. 证据
- reaction /riˈækʃn/ n. 反应
- data point /ˈdeɪtə pɔɪnt/ n. 数据点
- shift /ʃɪft/ v. 转移
- fatal /ˈfeɪtl/ adj. 致命的
- collision /kəˈlɪʒn/ n. 碰撞
- release /rɪˈliːs/ v. 发布
- statement /ˈsteɪtmənt/ n. 声明
短语
- at the wheel 驾驶中
- hand position 手部位置
- reaction time 反应时间
- shift blame onto sb 把责任推给某人
- fatal collision 致命碰撞
- request for comment 置评请求
10
While Tesla can access these records, it’s not so easy for drivers. They can request their data, but some say they’ve received only fragments—and have had to go to court to get more. When plaintiffs in a Florida wrongful-death case sought key evidence of how one of Tesla’s driver-assistance systems had failed, the company said it didn’t have the data. The plaintiffs had to hire a hacker, who recovered them from a computer chip in the crashed vehicle. Later, Tesla stated that the data had been sitting on its own servers for years, and that the company failed to locate them by mistake. (A judge did not find “sufficient evidence” to conclude that Tesla had sought to hide the data.)
特斯拉可以获取这些记录,驾驶员却没那么容易。他们可以申请调取自己的数据,但有人表示只拿到了零星片段,还不得不打官司才能获得更多。在佛罗里达州一起非正常死亡诉讼中,原告寻求特斯拉某驾驶辅助系统如何失灵的关键证据,公司却称没有这些数据。原告不得不雇佣黑客,从事故车辆的一块电脑芯片中恢复了数据。之后特斯拉表示,这些数据其实多年来一直存放在自己的服务器上,公司只是因失误未能找到。(法官认为没有“充分证据”证明特斯拉曾试图隐瞒数据。)
生词 & 音标
- access /ˈækses/ v. 获取
- fragment /ˈfræɡmənt/ n. 碎片,片段
- plaintiff /ˈpleɪntɪf/ n. 原告
- wrongful /ˈrɒŋfl/ adj. 非法的,不公正的
- seek /siːk/ v. 寻求
- hacker /ˈhækə(r)/ n. 黑客
- recover /rɪˈkʌvə(r)/ v. 恢复
- chip /tʃɪp/ n. 芯片
- server /ˈsɜːvə(r)/ n. 服务器
- locate /ləʊˈkeɪt/ v. 找到
- sufficient /səˈfɪʃnt/ adj. 充分的
短语
- go to court 打官司
- wrongful-death case 非正常死亡诉讼案
- driver-assistance systems 驾驶辅助系统
- computer chip 电脑芯片
- sufficient evidence 充分证据
11
My car didn’t warn me when it was confused. Chatbots don’t, either; they deliver their results in the same confident voice, whether they’re right or hallucinating. They perform expertise, even when the sources they cite are dubious or fabricated. They use technical language in an authoritative tone. And we believe them, because why wouldn’t we? They’ve been right so many times before.
我的车在“困惑”时并没有警告我。聊天机器人也是如此:无论答案正确还是在一本正经地胡说(幻觉),它们都用同样自信的语气输出结果。它们装作很专业的样子,即便引用的来源可疑甚至是编造的。它们用权威的口吻讲着专业术语。而我们选择相信它们,毕竟它们之前对过那么多次。
生词 & 音标
- warn /wɔːn/ v. 警告
- confused /kənˈfjuːzd/ adj. 困惑的
- chatbot /ˈtʃætbɒt/ n. 聊天机器人
- confident /ˈkɒnfɪdənt/ adj. 自信的
- hallucinate /həˈluːsɪneɪt/ v. (AI)产生幻觉,编造信息
- expertise /ˌekspɜːˈtiːz/ n. 专业知识
- cite /saɪt/ v. 引用
- dubious /ˈdjuːbiəs/ adj. 可疑的
- fabricated /ˈfæbrɪkeɪtɪd/ adj. 编造的
- authoritative /ɔːˈθɒrətətɪv/ adj. 权威的
短语
- perform expertise 装作专业
- technical language 专业术语
- authoritative tone 权威的语气
难点
- hallucinating:AI 幻觉,指模型编造事实、引用不存在的文献或数据。
12
Cars train us mile by mile; AI trains us week by week. In week one, you read a chatbot’s output carefully. By week three, you’re copying and pasting without reading. The errors don’t disappear, but your vigilance does. So does your judgment, until one day you realize that you can’t remember which ideas in a memo were yours and which were generated by AI. What does it say about us that we’ve handed over our thinking so willingly?
汽车一英里一英里地“训练”我们,AI则一周一周地驯化我们。第一周,你还会仔细看聊天机器人的输出;到第三周,你直接复制粘贴,看都不看。错误并没有消失,但你的警惕性消失了,判断力也随之下降。直到有一天,你发现自己记不清备忘录里哪些想法是自己的,哪些是AI生成的。我们如此心甘情愿地交出思考权,这说明了什么?
生词 & 音标
- train /treɪn/ v. 训练,驯化
- vigilance /ˈvɪdʒɪləns/ n. 警惕性
- judgment /ˈdʒʌdʒmənt/ n. 判断力
- memo /ˈmeməʊ/ n. 备忘录
- willingly /ˈwɪlɪŋli/ adv. 心甘情愿地
短语
- copy and paste 复制粘贴
- hand over 交出
- generated by AI 由AI生成
13
When my car failed, it was immediate and palpable. With chatbots, the failure is silent and invisible. You find out about it later, if at all—after the email is sent, the decision made, the code shipped. By the time you catch the mistake, it’s already out there with your name on it. When the system works, you look efficient. When it fails, your judgment is questioned, sometimes with catastrophic consequences. In 2023, a New York lawyer was sanctioned for citing six cases that didn’t exist. ChatGPT had invented them, but he’d trusted it, and the court blamed him, not the tool. Because a chatbot never gets fired.
我的汽车失灵时,后果是即时且显而易见的。但聊天机器人的失灵是沉默且隐形的。你往往事后才发现——邮件发出、决策做出、代码上线之后。等你发现错误时,问题已经顶着你的名字扩散出去了。系统正常时,你显得高效能干;系统失灵时,被质疑的是你的判断力,有时还会带来灾难性后果。2023年,纽约一名律师因引用了六个不存在的案例而受到处罚。这些案例都是ChatGPT编造的,但他信任了AI,法院惩罚的是他,而不是工具。因为聊天机器人永远不会被解雇。
生词 & 音标
- immediate /ɪˈmiːdiət/ adj. 立即的
- palpable /ˈpælpəbl/ adj. 明显的
- silent /ˈsaɪlənt/ adj. 沉默的
- invisible /ɪnˈvɪzəbl/ adj. 隐形的
- efficient /ɪˈfɪʃnt/ adj. 高效的
- catastrophic /ˌkætəˈstrɒfɪk/ adj. 灾难性的
- consequence /ˈkɒnsɪkwəns/ n. 后果
- sanction /ˈsæŋkʃn/ v. 处罚,制裁
- invent /ɪnˈvent/ v. 编造,发明
短语
- catastrophic consequences 灾难性后果
- cite cases 引用案例
- get fired 被解雇
14
We’re experiencing an uncanny valley of autonomy. Computer systems aren’t just almost human; they are almost capable of working on their own. When they fail, someone has to absorb the cost. Right now, that someone is us. But when we pay for a self-driving car or an AI tool, we think we’re buying a finished product, not signing up to test a work in progress.
我们正经历一个“自主性的恐怖谷”。计算机系统不仅几乎像人,而且几乎能够独立运行。当它们失败时,总得有人承担代价。现在,这个人就是我们。但当我们购买自动驾驶汽车或AI工具时,我们以为自己买的是成熟成品,而不是报名去测试一个仍在开发中的产品。
生词 & 音标
- uncanny /ʌnˈkæni/ adj. 诡异的
- valley /ˈvæli/ n. 山谷
- autonomy /ɔːˈtɒnəmi/ n. 自主,自主性
- capable /ˈkeɪpəbl/ adj. 有能力的
- absorb /əbˈzɔːb/ v. 承担(成本/后果)
- finished /ˈfɪnɪʃt/ adj. 完成的,成品的
- progress /ˈprəʊɡres/ n. 进展
短语
- uncanny valley 恐怖谷
- work in progress 在制品,未完成产品
背景
- uncanny valley:恐怖谷理论,指物体越像人但又不完全像人时,会让人感到不适与恐惧。
15
This “almost” phase isn’t a brief transition. It’s the product—one that will be with us for years, maybe decades. So it’s important to notice the patterns. When an AI system never admits uncertainty, or when a car’s marketing says “self-driving” but the fine print says “driver responsible,” that’s a warning sign. When you realize that you haven’t really been paying attention for the past 10 miles, or the past 10 auto-composed emails, that’s the trap.
这个“近乎完美”的阶段并不是短暂过渡。它本身就是产品形态——会伴随我们数年甚至数十年。因此,看清其中的模式很重要。当一个AI系统从不承认自己不确定,当汽车营销宣传“自动驾驶”,但小字条款写着“驾驶员负责”时,这就是警告信号。当你发现自己在过去10英里路程、或过去10封自动生成的邮件里都没有真正集中注意力时,你已经落入陷阱。
生词 & 音标
- phase /feɪz/ n. 阶段
- brief /briːf/ adj. 短暂的
- transition /trænˈzɪʃn/ n. 过渡
- decade /ˈdekeɪd/ n. 十年
- uncertainty /ʌnˈsɜːtnti/ n. 不确定性
- marketing /ˈmɑːkɪtɪŋ/ n. 营销
- trap /træp/ n. 陷阱
短语
- warning sign 警告信号
- fine print 合同小字(免责条款)
- auto-composed emails 自动生成的邮件
16
Things don’t have to be this way, but they won’t change unless consumers see the situation clearly and refuse to accept it. We should reject the deal we’ve been handed—the one where the terms of service become a shield for companies and a sword against users. We should demand that companies share the risk they’re enticing us into taking. If they design for complacency, they should get some of the blame when their product fails.
事情本不必如此,但除非消费者看清现状并拒绝接受,否则不会改变。我们应该拒绝这种被强加的协议:服务条款成为企业的盾牌,却变成刺向用户的利剑。我们应该要求企业与我们共同承担它们引诱我们卷入的风险。如果它们的设计会让人放松警惕,那么产品失灵时,企业也应承担部分责任。
生词 & 音标
- consumer /kənˈsjuːmə(r)/ n. 消费者
- reject /rɪˈdʒekt/ v. 拒绝
- shield /ʃiːld/ n. 盾牌
- sword /sɔːd/ n. 剑
- entice /ɪnˈtaɪs/ v. 引诱
- complacency /kəmˈpleɪsnsi/ n. 自满,松懈
短语
- terms of service 服务条款
- share the risk 分担风险
- design for complacency 让人松懈的设计
17
This isn’t a utopian goal. In July 2025, the Chinese carmaker BYD announced that it would pay for the damage caused by crashes involving its self-parking feature, sparing the driver’s insurance and record. It’s only one company, and only one feature, but it proves that accountability is a choice. Other businesses can be persuaded to opt in, too.
这并非不切实际的乌托邦目标。2025年7月,中国汽车制造商比亚迪宣布,将为其自动泊车功能引发的碰撞事故承担赔偿,不影响车主保险与记录。虽然只有一家公司、一个功能,但这证明责任归属是一种选择。其他企业也可以被说服效仿。
生词 & 音标
- utopian /juːˈtəʊpiən/ adj. 乌托邦的
- carmaker /ˈkɑːmeɪkə(r)/ n. 汽车制造商
- self-parking /ˌself ˈpɑːkɪŋ/ n. 自动泊车
- spare /speə(r)/ v. 免除,使免遭
- accountability /əˌkaʊntəˈbɪləti/ n. 责任,问责
- opt in 选择加入
短语
- self-parking feature 自动泊车功能
- spare insurance 不影响保险
- accountability is a choice 责任是一种选择
18
My kids were in the back seat when I had my car accident. One day, they’ll have their own cars and use AI in ways that I can’t even imagine yet. The systems they inherit will be built either to elevate them or to lull them and blame them when things go wrong. I want them to notice when they’re being trained. I want them to ask who absorbs the cost, and the damage.
我出车祸时,孩子们就在后座。总有一天,他们会拥有自己的汽车,使用我现在无法想象的AI技术。他们继承的系统,要么是为了成就他们,要么是为了麻痹他们,并在出事时把责任推给他们。我希望他们能意识到自己正在被“驯化”,希望他们会去问:谁在承担代价,谁在承担伤害?
生词 & 音标
- inherit /ɪnˈherɪt/ v. 继承
- elevate /ˈelɪveɪt/ v. 提升,成就
- lull /lʌl/ v. 麻痹,使放松
短语
- go wrong 出错,失灵
- absorb the cost and damage 承担代价与伤害