Chat YouTube:
The video discusses using artificial intelligence to turn brain scans into text, in effect reading a person's thoughts. Researchers created a 3D view of the human brain and used AI to translate the active regions into English, producing what looks like inner dialogue. The technology could help stroke patients, people with locked-in syndrome, and others who cannot speak normally to communicate. However, it also raises concerns about mental privacy and freedom of thought.


Q: How does the AI technology turn brain scans into text?
Chat YouTube:
The technology works by creating a 3D view of the human brain and using AI to translate the active regions into English. The AI processes the brain activity and generates what looks like inner dialogue.
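The study's actual pipeline (an fMRI encoding model paired with a GPT-1 language model, per the transcript below) is far more involved, but the general idea of mapping brain activity into a shared meaning space can be sketched. The following toy example is entirely invented for illustration: it fits a ridge-regression decoder from fake "voxel" vectors to fake text embeddings, then picks the candidate sentence whose embedding is closest to the decoded one. No names or numbers here come from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data: each "scan" is a vector of voxel activations, paired during
# training with an embedding of the text the subject heard.
n_voxels, embed_dim, n_train = 200, 16, 500
W_true = rng.normal(size=(n_voxels, embed_dim))        # unknown meaning->brain map
train_embeds = rng.normal(size=(n_train, embed_dim))
train_scans = train_embeds @ W_true.T + 0.1 * rng.normal(size=(n_train, n_voxels))

# Fit a ridge-regression decoder: scan -> text embedding.
lam = 1.0
A = train_scans.T @ train_scans + lam * np.eye(n_voxels)
W_hat = np.linalg.solve(A, train_scans.T @ train_embeds)  # (n_voxels, embed_dim)

def decode(scan, candidates):
    """Project a scan into embedding space; return the nearest candidate sentence."""
    pred = scan @ W_hat
    sims = [pred @ e / (np.linalg.norm(pred) * np.linalg.norm(e) + 1e-9)
            for _, e in candidates]
    return candidates[int(np.argmax(sims))][0]

# Invented candidate sentences with invented embeddings.
candidates = [(s, rng.normal(size=embed_dim))
              for s in ["a girl gets knocked down",
                        "a dog runs in a park",
                        "rain falls on a city"]]
target_text, target_embed = candidates[0]
new_scan = target_embed @ W_true.T        # noiseless scan evoked by sentence 0
print(decode(new_scan, candidates))       # -> "a girl gets knocked down"
```

The real system decodes open-ended word sequences with a language model rather than ranking a fixed candidate list; this sketch only shows the core regression-plus-similarity idea.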


Q: What are the potential benefits of this technology, and who would benefit?
Chat YouTube:
The technology could help stroke patients and people with locked-in syndrome or other conditions that prevent normal speech to communicate. It could also let people with ALS or similar diseases use their thoughts to communicate with the outside world. In addition, it could be used in everyday applications to track brain health and day-to-day focus.


Q: What concerns does this technology raise about mental privacy and freedom of thought?
Chat YouTube:
One concern is that if everyday mind reading becomes possible with portable devices, mental privacy could be compromised. Another is that the technology could be misused by unauthorized individuals or organizations for malicious purposes, such as extracting people's thoughts without their consent. There are also ethical questions around its use, such as who can access the decoded information and how it may be used.


Q: How could AI that turns brain scans into text be applied in brain-computer interface devices?
Chat YouTube:
AI that translates brain scans into text could be applied in brain-computer interface (BCI) devices to help people who have lost the ability to speak, or who have other neurological conditions that prevent normal communication. With a BCI, they could use their brain activity to control a computer or other external device, with the AI converting that activity into text. For example, they could send specific messages or operate external devices through the BCI, gaining more independence and autonomy. AI could also analyze brain activity to help doctors diagnose and treat patients more effectively, improving their quality of life. However, more research is still needed to ensure the technology's safety and privacy.
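The BCI workflow described above — calibrate on a user's brain activity, then match live signals to intended commands — can be sketched as a toy template-matching loop. Everything here (the 8-dimensional feature windows, the command vocabulary, the calibration templates) is invented for illustration; real BCIs use far richer signals and trained classifiers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: a wearable reports a short feature vector per time window,
# and we match it against per-command "templates" recorded during calibration.
COMMANDS = ["yes", "no", "help", "rest"]
templates = {c: rng.normal(size=8) for c in COMMANDS}   # invented calibration data

def classify(window):
    """Return the command whose calibration template is closest to this window."""
    return min(COMMANDS, key=lambda c: float(np.linalg.norm(window - templates[c])))

def run_session(windows):
    """Turn a stream of activity windows into a text message, skipping 'rest'."""
    words = [classify(w) for w in windows]
    return " ".join(w for w in words if w != "rest")

# Simulate a user "thinking" yes, rest, help (templates plus small noise).
stream = [templates["yes"] + 0.05 * rng.normal(size=8),
          templates["rest"] + 0.05 * rng.normal(size=8),
          templates["help"] + 0.05 * rng.normal(size=8)]
print(run_session(stream))  # -> "yes help"
```

Nearest-template matching is used here only because it is the simplest decoder to show; the 'rest' class illustrates why a BCI needs an idle state so that not every brain signal becomes output.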


0:00 Anchor: All right, get ready, because tonight's Future of Everything is mind-bending, or actually mind-reading, because it's looking a lot more like AI is capable of reading our thoughts. And we're not just talking about it predicting what kind of TikTok we want to see next or what kind of products we might buy. This next breakthrough is a lot deeper, all the way down to the very tissue of our brains, where researchers have used AI to translate brain scans into text. For the study, scientists at UT Austin created a 3D view of a person's brain while they listened, watched, or imagined a story, and we've got it for you right there: all that pink stuff means above-average brain activity, and the blue spots have below-average brain activity. To us, we see that and we're like, okay, that part's activated, that part's not. But researchers say the AI was able to read the brain waves and translate all of that that you're seeing right there into English, and turn it into what looks like inner dialogue.

1:02 Researcher: This is a real leap forward compared to what's been done before, which is typically on, like, single words or short sentences, where you can get something about the gist of a sentence or something about, you know, what a word is related to. But we're getting whole strings of text, pretty complicated ideas.

1:21 Anchor: Okay, so what does that actually look like? Check this out. Here are the volunteers, and the volunteers were asked to watch a movie clip without audio, and this is what the AI described. You're seeing it right there, actual stimulus, and they described: "I see a girl that looks just like me get hit on her back and then she is knocked off," end quote. The eye saw it, the brain processed it, and the AI typed out what they were thinking. I'm still trying to figure out how that would even work. With us now is Nita Farahany. She is a futurist, an ethicist, a law professor at Duke University, and the author of "The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology." Welcome, Doctor. So researchers are saying that they hope that this could one day help victims of, like, a stroke. But how do you see this type of technology being used?

2:15 Farahany: I think it has tremendous promise, in that it actually offers potentially people who have locked-in syndrome, suffering from ALS or other conditions, to be able to have their thoughts interpreted into speech so they could communicate with the outside world. It also has a potential for everyday people to use it, and when that starts to happen, I start to worry about what it looks like for mental privacy and for freedom of thought of individuals.

2:41 Anchor: Now, just to be clear, like, in this case, people are sticking electrodes... I mean, there's a lot of monitors.

2:50 Farahany: Yeah, one of the things that's extraordinary about this is there are no electrodes. This is a big machine, a functional magnetic resonance imaging machine. That means that people went into this machine, and basically without any electrodes inside of the brain, they were able to pick up blood-flow changes in their brain. And one of the things that's incredibly remarkable about this study is not only the precision of what they picked up, but then they decided to test and see if the model that they created, which was using GPT-1 generative AI (we're all talking about ChatGPT these days), they wanted to see if they could use that same classifier on a portable system, not a big bulky device like fMRI. So they tried it out on something called fNIRS, functional near-infrared spectroscopy, which is a wearable device, something that people could use in their everyday lives to track their brain health, to track their, you know, everyday focus and mind-wandering. They found that they could get the same level of precision using that kind of portable device with the classifier that they had developed. In other words, everyday mind reading may be possible using devices that we may come to use in our everyday lives.

3:58 Anchor: It's non-invasive, and GPT-1, so this was done before GPT-4. We don't know what's on the horizon. I mean, your book talks about mental privacy. What are your concerns when it comes to AI and other neurotechnologies that could infringe on that?

4:18 Farahany: So the coming future that I see is the use of everyday brain wearables. People are used to wearing, for example, sensors that track their heart rate or track their sleep activity in the form of a watch or a ring. Now companies like Meta and Snap are starting to buy and develop their own neurotechnology, to put sensors that track brain activity into everyday devices like earbuds or headphones that people could use to take a conference call but also track their brain activity. Which can be promising for things like tracking brain health or tracking focus and attention, but it also means that the very same corporations who are commodifying all of our personal data are suddenly going to have access to what's happening in our own brains. And then I fear that our right to mental privacy, our right to freedom of thought, could substantially be interfered with. We've already seen that kind of interference with things like Cambridge Analytica. When you can start to have much more precise access to what people are thinking and feeling, and then change what they're seeing in their social media, or the kinds of algorithms that are feeding different information to them, and the platforms that they interact with, I worry that there's a future that we're just not ready for.

5:29 Anchor: Yeah, Nita, I feel like we should be talking to you every single day. I hope you come back. Thank you so much for this conversation, and hopefully it's one of many more to come. Thanks so much for joining us.

5:39 Farahany: Thanks for having me.

5:40 Anchor: That does it for us tonight. Thanks for watching our YouTube channel. Follow today's top stories and breaking news by downloading the NBC News app.
AI芯視界 posted on 痞客邦 (PIXNET)