<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>brainwaves Archives - Stuff South Africa</title>
	<atom:link href="https://stuff.co.za/tag/brainwaves/feed/" rel="self" type="application/rss+xml" />
	<link>https://stuff.co.za/tag/brainwaves/</link>
	<description>South Africa&#039;s Technology News Hub</description>
	<lastBuildDate>Thu, 31 Aug 2023 08:24:10 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://stuff.co.za/wp-content/uploads/2021/10/favicon-transparent-1-150x150.png</url>
	<title>brainwaves Archives - Stuff South Africa</title>
	<link>https://stuff.co.za/tag/brainwaves/</link>
	<width>32</width>
	<height>32</height>
</image> 
<atom:link rel="hub" href="https://pubsubhubbub.appspot.com"/>
<atom:link rel="hub" href="https://pubsubhubbub.superfeedr.com"/>
<atom:link rel="hub" href="https://websubhub.com/hub"/>
<atom:link rel="self" href="https://stuff.co.za/tag/brainwaves/feed/"/>
	<item>
		<title>AI can hear your brainwaves and tell you what music you’re listening to</title>
		<link>https://stuff.co.za/2023/08/31/ai-turns-brainwaves-into-music/</link>
		
		<dc:creator><![CDATA[Toby Shapshak]]></dc:creator>
		<pubDate>Thu, 31 Aug 2023 08:24:10 +0000</pubDate>
				<category><![CDATA[AI News]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Other Tech News]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[brain scans]]></category>
		<category><![CDATA[brainwaves]]></category>
		<category><![CDATA[Google AI]]></category>
		<category><![CDATA[music]]></category>
		<guid isPermaLink="false">https://stuff.co.za/?p=182793</guid>

					<description><![CDATA[<p>Many of us know that awful brain trick called an earworm – when you just can’t get a song (usually a bad one) out of your head. Often, you can’t quite remember what the name of the song is, or who sang it. But now researchers have used artificial intelligence (AI) to examine someone’s brain [...]</p>
<p>The post <a href="https://stuff.co.za/2023/08/31/ai-turns-brainwaves-into-music/">AI can hear your brainwaves and tell you what music you’re listening to</a> appeared first on <a href="https://stuff.co.za">Stuff South Africa</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Many of us know that awful brain trick called an earworm – when you just can’t get a song (usually a bad one) out of your head. Often, you can’t quite remember what the name of the song is, or who sang it. But now researchers have used <a href="http://stuff.co.za/tag/ai" target="_blank" rel="noopener">artificial intelligence</a> (AI) to examine someone’s brain waves, as it were, and tell you what song it is.</p>
<p>Well, almost – it creates a similar song from the same genre, with the same rhythm, mood and instruments.</p>
<p>Yes, really.</p>
<h3>Don&#8217;t rack your brain, make AI do it</h3>
<p>The researchers created software called Brain2Music that scans your brain and then uses this imaging data to create snippets of the song being listened to. Although the <a href="https://arxiv.org/abs/2307.11078" target="_blank" rel="noopener">research</a> hasn’t been peer-reviewed, it is the latest in a number of interesting studies about what can be gleaned from brainwaves.</p>
<p>&#8220;The agreement, in terms of the mood of the reconstructed music and the original music, was around 60%,&#8221; said Timo Denk, who co-authored the paper.</p>
<p>&#8220;The method is pretty robust across the five subjects we evaluated. If you take a new person and train a model for them, it&#8217;s likely that it will also work well,&#8221; Denk, who is a software engineer at <a href="http://stuff.co.za/tag/google">Google</a> in Switzerland, told <em>Live Science</em>.</p>
<p>They used a highly sophisticated functional magnetic resonance imaging (fMRI) machine, which shows which parts of the brain are firing by examining where blood is concentrated.</p>
<p>They scanned five people while they listened to 15-second clips of classical music, blues, country, disco, hip-hop, jazz and pop songs. Then Denk and his team trained the AI model to analyse the data for each clip’s instruments and genre, as well as its rhythm and mood (happy, sad, angry, exciting).</p>
<p>After customising the algorithm for each participant, the researchers were able to reconstruct the music being played. Unsurprisingly, classical music produced the best results.</p>
<p>All of this from brain scans. Wow.</p>
<p><a href="https://www.yahoo.com/lifestyle/googles-mind-reading-ai-tell-154627617.html" target="_blank" rel="noopener"><em>Source</em></a></p>
<p>The post <a href="https://stuff.co.za/2023/08/31/ai-turns-brainwaves-into-music/">AI can hear your brainwaves and tell you what music you’re listening to</a> appeared first on <a href="https://stuff.co.za">Stuff South Africa</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Got Zoom fatigue? Out-of-sync brainwaves could be another reason videoconferencing is such a drag</title>
		<link>https://stuff.co.za/2021/12/19/got-zoom-fatigue-out-of-sync-brainwaves-could-be-another-reason-videoconferencing-is-such-a-drag/</link>
		
		<dc:creator><![CDATA[The Conversation]]></dc:creator>
		<pubDate>Sun, 19 Dec 2021 12:44:39 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Other Tech News]]></category>
		<category><![CDATA[brainwaves]]></category>
		<category><![CDATA[featured]]></category>
		<category><![CDATA[psychology]]></category>
		<category><![CDATA[Remote work]]></category>
		<category><![CDATA[video calls]]></category>
		<category><![CDATA[zoom]]></category>
		<guid isPermaLink="false">https://stuff.co.za/?p=138936</guid>

					<description><![CDATA[<p>During the pandemic, video calls became a way for me to connect with my aunt in a nursing home and with my extended family during holidays. Zoom was how I enjoyed trivia nights, happy hours and live performances. As a university professor, Zoom was also the way I conducted all of my work meetings, mentoring [...]</p>
<p>The post <a href="https://stuff.co.za/2021/12/19/got-zoom-fatigue-out-of-sync-brainwaves-could-be-another-reason-videoconferencing-is-such-a-drag/">Got Zoom fatigue? Out-of-sync brainwaves could be another reason videoconferencing is such a drag</a> appeared first on <a href="https://stuff.co.za">Stuff South Africa</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>During the pandemic, video calls became a way for me to connect with my aunt in a nursing home and with my extended family during holidays. Zoom was how I enjoyed trivia nights, happy hours and live performances. As a university professor, Zoom was also the way I conducted all of my work meetings, mentoring and teaching.</p>
<p>But I often felt drained after Zoom sessions, even some of those that I had scheduled for fun. <a href="https://news.stanford.edu/2021/02/23/four-causes-zoom-fatigue-solutions/">Several well-known factors</a> – intense eye contact, slightly misaligned eye contact, being on camera, limited body movement, lack of nonverbal communication – contribute to Zoom fatigue. But I was curious about why conversation felt more laborious and awkward over Zoom and other video-conferencing software, compared with in-person interactions.</p>
<p>As a researcher who <a href="https://scholar.google.com/citations?user=8j4_-aYAAAAJ&amp;hl=en">studies psychology and linguistics</a>, I decided to examine the impact of video-conferencing on conversation. Together with three undergraduate students, I ran <a href="https://doi.apa.org/doi/10.1037/xge0001150">two experiments</a>.</p>
<p>The first experiment found that response times to prerecorded yes/no questions more than tripled when the questions were played over Zoom instead of being played from the participant’s own computer.</p>
<p>The second experiment replicated the finding in natural, spontaneous conversation between friends. In that experiment, transition times between speakers averaged 135 milliseconds in person, but 487 milliseconds for the same pair talking over Zoom. While under half a second seems pretty quick, that difference is an eternity in terms of natural conversation rhythms.</p>
<p>We also found that people held the floor for longer during Zoom conversations, so there were fewer transitions between speakers. These experiments suggest that the natural rhythm of conversation is disrupted by videoconferencing apps like Zoom.</p>
<h2>Cognitive anatomy of a conversation</h2>
<p>I already had some expertise in studying conversation. Pre-pandemic, I conducted several experiments investigating how topic shifts and working memory load affect the timing of when speakers in a conversation take turns.</p>
<p>In that research, I found that <a href="https://cogsci.mindmodeling.org/2019/papers/0048/index.html">pauses between speakers were longer</a> when the two speakers were talking about different things, or if a speaker was distracted by another task while conversing. I originally became interested in the timing of turn transitions because planning a response during conversation is a complex process that people accomplish with lightning speed.</p>
<p>The average pause between speakers in two-party conversations is about one-fifth of a second. In comparison, it takes more than a half-second to <a href="https://doi.org/10.1080/00140139508925238">move your foot from the accelerator to the brake</a> while driving – more than twice as long.</p>
<p>The speed of turn transitions indicates that listeners don’t wait until the end of a speaker’s utterance to begin planning a response. Rather, listeners simultaneously comprehend the current speaker, plan a response and predict the appropriate time to initiate that response. All of this multitasking ought to make conversation quite laborious, but it is not.</p>
<h2>Getting in sync</h2>
<p>Brainwaves are the rhythmic firing, or oscillation, of neurons in your brain. These oscillations may be one factor that helps make conversation effortless. <a href="https://doi.org/10.1017/9781108610728">Several</a> <a href="https://doi.org/10.3758/BF03206432">researchers</a> have proposed that a neural oscillatory mechanism automatically synchronizes the firing rate of a group of neurons to the speech rate of your conversation partner. This oscillatory timing mechanism would relieve some of the mental effort in planning when to begin speaking, especially if it was <a href="https://doi.org/10.7554/eLife.68066">combined with predictions</a> about the remainder of your partner’s utterance.</p>
<p>While there are many open questions about how oscillatory mechanisms affect perception and behavior, there is <a href="https://doi.org/10.3389/fpsyg.2012.00320">direct</a> <a href="https://doi.org/10.1038/nn.4186">evidence</a> for neural oscillators that track syllable rate when syllables are presented at regular intervals. For example, when you hear syllables four times a second, the electrical activity in your brain <a href="https://doi.org/10.1038/nn.4186">peaks at the same rate</a>.</p>
<figure class="align-center zoomable">
<div class="placeholder-container"><img decoding="async" class=" ls-is-cached lazyloaded" src="https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px" srcset="https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=115&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=115&amp;fit=crop&amp;dpr=2 1200w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=115&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=145&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=145&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=145&amp;fit=crop&amp;dpr=3 2262w" alt="A spectrograph of human speech with a rough sine wave overlaid on it" data-src="https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip" data-srcset="https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=115&amp;fit=crop&amp;dpr=1 600w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=115&amp;fit=crop&amp;dpr=2 1200w, 
https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=115&amp;fit=crop&amp;dpr=3 1800w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=145&amp;fit=crop&amp;dpr=1 754w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=145&amp;fit=crop&amp;dpr=2 1508w, https://images.theconversation.com/files/435178/original/file-20211201-15-how79x.jpg?ixlib=rb-1.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=145&amp;fit=crop&amp;dpr=3 2262w" /></div>
<div class="enlarge_hint"></div><figcaption><span class="caption">This acoustic spectrogram of the utterance ‘Do you think surfers are scared of being bitten by a shark?’ has an overlaid oscillatory function (blue wave). This shows that midpoints of most syllables (numbered hash marks) occur at or near the wave troughs, regardless of syllable length. The hash marks were generated with a Praat script written by deJong and Wempe.</span> <span class="attribution"><span class="source">Julie Boland</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure>
<p>There is also evidence that <a href="https://doi.org/10.1093/oso/9780190618216.001.0001">oscillators can accommodate some variability</a> in syllable rate. This makes the notion that an automatic neural oscillator could track the fuzzy rhythms of speech plausible. For example, an oscillator with a period of 100 milliseconds could keep in sync with speech that varies from 80 milliseconds to 120 milliseconds per short syllable. Longer syllables are not a problem if their duration is a multiple of the duration for short syllables.</p>
<h2>Internet lag is a wrench in the mental gears</h2>
<p>My hunch was that this proposed oscillatory mechanism couldn’t function very well over Zoom due to variable transmission lags. In a video call, the audio and video signals are split into packets that zip across the internet. In our studies, each packet took around 30 to 70 milliseconds to travel from sender to receiver, including disassembly and reassembly.</p>
<p>While this is very fast, it adds too much additional variability for brainwaves to sync with speech rates automatically, and more arduous mental operations have to take over. This could help explain my sense that Zoom conversations were more fatiguing than having the same conversation in person would have been.</p>
<p><a href="https://doi.apa.org/doi/10.1037/xge0001150">Our experiments</a> demonstrated that the natural rhythm of turn transitions between speakers is disrupted by Zoom. This disruption is consistent with what would happen if the neural ensemble that <a href="https://doi.org/10.1093/oso/9780190618216.001.0001">researchers believe normally synchronizes with speech</a> fell out of sync due to electronic transmission delays.</p>
<p>Our evidence supporting this explanation is indirect. We did not measure cortical oscillations, nor did we manipulate the electronic transmission delays. Research into the connection between neural oscillatory timing mechanisms and speech in general <a href="https://doi.org/10.1038/s41583-020-0304-4">is promising</a> but not definitive.</p>
<p>Researchers in the field need to pin down an oscillatory mechanism for naturally occurring speech. From there, cortical tracking techniques could show whether such a mechanism is more stable in face-to-face conversations than with video-conferencing conversations, and how much lag and how much variability cause disruption.</p>
<p>Could the syllable-tracking oscillator tolerate relatively short but realistic electronic lags below 40 milliseconds, even if they varied dynamically from 15 to 39 milliseconds? Could it tolerate relatively long lags of 100 milliseconds if the transmission lag were constant instead of variable?</p>
<p>The knowledge gained from such research could open the door to technological improvements that help people get in sync and make videoconferencing conversations less of a cognitive drag.</p>
<ul>
<li><a href="https://theconversation.com/profiles/julie-boland-248289" rel="author"><span class="fn author-name">Julie Boland</span></a> is a Professor of Psychology and Linguistics, University of Michigan</li>
<li>This article first appeared on <a href="https://theconversation.com/got-zoom-fatigue-out-of-sync-brainwaves-could-be-another-reason-videoconferencing-is-such-a-drag-172380"><em>The Conversation</em></a></li>
</ul>
<p><iframe src="https://counter.theconversation.com/content/172380/count.gif?distributor=republish-lightbox-advanced" width="1" height="1"></iframe></p>
<p>The post <a href="https://stuff.co.za/2021/12/19/got-zoom-fatigue-out-of-sync-brainwaves-could-be-another-reason-videoconferencing-is-such-a-drag/">Got Zoom fatigue? Out-of-sync brainwaves could be another reason videoconferencing is such a drag</a> appeared first on <a href="https://stuff.co.za">Stuff South Africa</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>South Korea trials brainwave sensors for bus drivers to improve safety</title>
		<link>https://stuff.co.za/2021/12/07/south-korea-bus-brainwaves/</link>
		
		<dc:creator><![CDATA[Max Milella]]></dc:creator>
		<pubDate>Tue, 07 Dec 2021 10:45:15 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Other Tech News]]></category>
		<category><![CDATA[brain tech]]></category>
		<category><![CDATA[brainwaves]]></category>
		<category><![CDATA[bus]]></category>
		<category><![CDATA[bus drivers]]></category>
		<category><![CDATA[featured]]></category>
		<category><![CDATA[Hyundai]]></category>
		<category><![CDATA[South Korea]]></category>
		<guid isPermaLink="false">https://stuff.co.za/?p=138093</guid>

					<description><![CDATA[<p>As part of a &#8216;Safer Public Bus&#8217; campaign, the South Korean province of Gyeonggi is looking to improve the safety of its bus systems by monitoring the brain waves of drivers. Drivers equipped with the brainwave sensors, developed by Hyundai, will receive a number of audio-visual alerts in response to brain activity indicating that they [...]</p>
<p>The post <a href="https://stuff.co.za/2021/12/07/south-korea-bus-brainwaves/">South Korea trials brainwave sensors for bus drivers to improve safety</a> appeared first on <a href="https://stuff.co.za">Stuff South Africa</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>As part of a &#8216;Safer Public Bus&#8217; campaign, the South Korean province of Gyeonggi is looking to improve the safety of its bus systems by monitoring the brain waves of drivers. Drivers equipped with the brainwave sensors, developed by Hyundai, will receive a number of audio-visual alerts in response to brain activity indicating that they are driving unsafely.</p>
<h3><strong>Getting into the mind of a bus driver</strong></h3>
<p>This sounds a little intrusive at first, and it would be if there were some governmental mandate forcing drivers to shove <a href="https://stuff.co.za/2019/07/17/afraid-of-mind-control-better-not-check-out-elon-musks-neuralink/">Neuralink</a> chips into their ears, but that&#8217;s not what this is. This system is currently in a trial phase, and will be until the end of the year. It also operates entirely on a volunteer basis.</p>
<p><em>Cities Today</em> <a href="https://thenextweb.com/news/bus-drivers-brainwaves-monitored-south-korean-safety-pilot-syndication">(via <em>The Next Web</em>)</a> reports that so far 20 drivers have stepped up to participate.</p>
<p>The Hyundai-made sensors are fitted to the drivers&#8217; ears, and monitor brain activity. Should they pick up indicators of anything that may impair a driver&#8217;s ability (such as drowsiness), they&#8217;ll signal a number of systems that will provide the driver with an audio and/or visual alert (such as flashing LEDs and smartphone notifications) that their driving is dangerous.</p>
<p>&#8220;Accidents occur many times because of drivers’ drowsiness, stress, or careless driving when they are behind the wheel,&#8221; said Bus Policy Division Manager Chang-hee Cho to<em> Cities Today.</em></p>
<p>&#8220;This ear-set sensor will help reduce the possibility of accidents if the driver is stressed or tired, and contribute to making driving safer.&#8221;</p>
<p>The Gyeonggi government is working closely with the drivers taking part in the trial to improve the system.</p>
<p>&#8220;After [the initial test], we will conduct a survey of bus drivers who participated in the test and hear diverse voices from experts and those involved to decide whether to make further expansions.&#8221;</p>
<p>The post <a href="https://stuff.co.za/2021/12/07/south-korea-bus-brainwaves/">South Korea trials brainwave sensors for bus drivers to improve safety</a> appeared first on <a href="https://stuff.co.za">Stuff South Africa</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
