How to Say 杀手 (Killer) in English

Updated: 2022-04-20 17:19:29 | Source: yutu

  Most readers will be familiar with the word 杀手 (killer); we encounter this special profession in all kinds of films and TV dramas. This article introduces how to say it in English, together with common phrases and example sentences. Happy reading!

  English words for "killer"

  killer

  slayer

  Phrases with "killer"

  killer instinct

  professional killer

  Example sentences

  1. The vital clue to the killer's identity was his nickname, Peanuts.

  2. Depression is the third biggest killer threatening my patients' health.

  3. It's a film about a serial killer and not for the faint-hearted.

  4. Heart disease is the biggest killer of men in most developed countries.

  5. A hit man had been sent to silence her over the affair.

  6. Heart disease is the biggest killer, claiming 180,000 lives a year.

  7. Police are theorizing that the killers may be posing as hitchhikers.

  8. Other officers gave chase but the killers escaped.

  9. a cold and calculating killer

  10. It was the deadly striker's 11th goal of the season.

  11. He is a hired killer.

  12. They were professional killers who did in John.

  13. She took out a contract on her ex-husband.

  14. Cannibal killer Jeffrey Dahmer has been caught trying to hide a razor blade in his cell.

  15. Their cold-blooded killers had then dragged their lifeless bodies upstairs to the bathroom.

  Related reading in English: Killer robots would be mankind's nightmare

  Mankind is a bloodthirsty species. According to Steven Pinker, the academic, for much of history being murdered by a fellow human was the leading cause of death. Civilisation is largely a tale of man’s violent instincts being progressively muffled. A part of this is the steady withdrawal of actual human flesh from the battle zone, with front lines gradually pulled apart by the advent of long-range artillery and air power, and the decline in the public’s tolerance for casualties.

  Arguably, America’s principal offensive weapon is the drone, firing on targets thousands of miles from where its controller safely sits. Given the pace of advance, it takes no imaginative leap to foresee machines displacing human agency altogether from the act of killing. Artificial brains already perform well in tasks hitherto regarded as the province of humans. Computers will be trusted with driving a car or diagnosing an illness. Algorithmic intelligence could therefore surpass the human sort for making the decision to kill.

  This prospect has prompted more than 1,000 artificial intelligence experts to write an open letter calling for the development of “lethal, autonomous weapons systems” to cease forthwith. Act now, they urge, or what they inevitably dub “killer robots” will be as widespread, and as deadly, as the Kalashnikov rifle.

  It is easy to understand military enthusiasm for robotic warfare. Soldiers are precious, expensive and fallible. Every conflict exacts a heavy toll from avoidable human error. Machines in contrast neither grow weary nor lose patience. They can be sent into places unsafe or even impossible for ordinary soldiers. Rapid improvements in computational power are giving machines “softer” skills, such as the ability to identify an individual, flesh-and-blood target. Robots could eventually prove safer than even the most experienced soldier, for example by being capable of picking out a gunman from a crowd of children — then shooting him.

  The case against robotic warfare is the same one that applies to all advances in weaponry: the avoidance of unforeseeable consequences that cause unlimited damage to the innocent. Whatever precautions are taken, there is no foolproof way to stop weapons falling into the wrong hands. For a glimpse into what could go wrong, recall how Chrysler, the US carmaker, had to debug 1.4m vehicles after finding that they could be remotely hacked. Now imagine those cars came equipped with guns.

  Technological futurists also fret about the exponential nature of advances in artificial intelligence. The scientist Stephen Hawking recently warned of the “technological catastrophe” that would follow artificial intelligence vastly exceeding the human sort. Whether this is an inevitability or a fantasy, science itself cannot decide; but in light of the risk, how sensible can it be to arm such super-intelligences?

  The moral argument is more straightforward. The abhorrence of killing has been as important to its decline as any technological breakthrough. Inserting artificial intelligence into the causal chain would muddle the responsibility that must underpin any decision to kill. Without clear responsibility, not only might the means to wage war be enhanced, but so too might the appetite for doing so.

  Uninventing weapons is impossible: consider anti-personnel landmines, autonomous weapons in their own way, which are still killing 15,000-20,000 people annually. The nature of artificial intelligence renders it impossible to foresee where the development of autonomous weapons would end. No amount of careful programming could limit the consequences. Far better not to embark on such a journey.

Original article: https://www.t7t8.net/jiaoan/151022.html