<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Intelligent Systems and Assistive Technologies Lab</title>
	<atom:link href="https://Engineering.Purdue.Edu/isat/feed/" rel="self" type="application/rss+xml" />
	<link>https://Engineering.Purdue.Edu/isat</link>
	<description>Bridging the gap between humans and robots</description>
	<lastBuildDate>Fri, 07 Feb 2020 17:16:19 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=4.1.42</generator>
	<item>
		<title>Paper related to quantification of gestures accepted at HFES 2018</title>
		<link>https://Engineering.Purdue.Edu/isat/paper-related-to-quantification-of-gestures-accepted-at-hfes-2018/</link>
		<comments>https://Engineering.Purdue.Edu/isat/paper-related-to-quantification-of-gestures-accepted-at-hfes-2018/#comments</comments>
		<pubDate>Tue, 15 May 2018 21:49:36 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[News]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=428</guid>
		<description><![CDATA[A paper authored by ISAT members Glebys Gonzalez, Naveen Madapana, and Rahul Taneja has been accepted and is in press for the HFES 2018 International Annual Meeting.<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/paper-related-to-quantification-of-gestures-accepted-at-hfes-2018/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p>A paper authored by ISAT members Glebys Gonzalez, Naveen Madapana, and Rahul Taneja has been accepted and is in press for the HFES 2018 International Annual Meeting.</p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/paper-related-to-quantification-of-gestures-accepted-at-hfes-2018/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Two papers accepted at ICPR 2018</title>
		<link>https://Engineering.Purdue.Edu/isat/two-papers-accepted-at-icpr-2018/</link>
		<comments>https://Engineering.Purdue.Edu/isat/two-papers-accepted-at-icpr-2018/#comments</comments>
		<pubDate>Tue, 01 May 2018 21:39:46 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[News]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=424</guid>
		<description><![CDATA[Two papers authored by Mr. Naveen Madapana and Mrs. Ting Zhang have been accepted and are in press at the International Conference on Pattern Recognition (ICPR 2018).<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/two-papers-accepted-at-icpr-2018/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p>Two papers authored by Mr. Naveen Madapana and Mrs. Ting Zhang have been accepted and are in press at the International Conference on Pattern Recognition (ICPR 2018).</p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/two-papers-accepted-at-icpr-2018/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Telerobotic Surgery with Free Hand Gestures</title>
		<link>https://Engineering.Purdue.Edu/isat/telerobotic-surgery-with-free-hand-gestures/</link>
		<comments>https://Engineering.Purdue.Edu/isat/telerobotic-surgery-with-free-hand-gestures/#comments</comments>
		<pubDate>Thu, 26 Apr 2018 19:53:54 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Research]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=291</guid>
		<description><![CDATA[Description: Current teleoperated robot-assisted surgery requires surgeons to manipulate joystick-like controllers at a master console, while robotic arms mimic those motions on the patient&#8217;s side. It is becoming more popular than traditional minimally invasive surgery due to its dexterity, precision, and accurate motion planning capabilities. However, one major…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/telerobotic-surgery-with-free-hand-gestures/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p style="text-align: justify;"><span style="color: #000000;"><strong><span style="font-size: x-large;">Description: </span></strong><br />
Current teleoperated robot-assisted surgery requires surgeons to manipulate joystick-like controllers at a master console, while robotic arms mimic those motions on the patient&#8217;s side. It is becoming more popular than traditional minimally invasive surgery due to its dexterity, precision, and accurate motion planning capabilities. However, one major drawback of such systems concerns the user experience: the surgeon has to retrain extensively in order to learn how to operate cumbersome interfaces.</span></p>
<p style="text-align: justify;"><span style="color: #000000;">To address this problem, we have developed an innovative system to involve touchless interfaces for telesurgery. This type of solution, when applied to robotic surgery, has the potential to allow surgeons to operate as if they were physically engaged with the surgery in-situ (as standard in traditional surgery). By relying on touchless interfaces, the system can incorporate more natural gestures that are similar to instinctive movements performed by surgeons when operating, thus enhancing the user experience and overall system performance. Sensory substitution methods are used as well to deliver force feedback to the user during teleoperation.</span></p>
<p><span style="color: #000000;"><strong><span style="font-size: x-large;">Publications:</span></strong></span><br />
<span style="color: #000000;">Zhou, Tian, Cabrera, Maria Eugenia, Low, Thomas, Sundaram, Chandru &amp; Wachs, Juan (2016). </span><a href="http://humanrobotinteraction.org/journal/index.php/HRI/article/view/269" target="_blank">A Comparative Study for Telerobotic Surgery Using Free Hand Gestures</a><span style="color: #000000;">. <em>Journal of Human-Robot Interaction, 5</em>, 1-28.</span></p>
<p><span style="color: #000000;">Zhou, Tian<strong>.</strong>, Cabrera, Maria., &amp; Wachs, Juan. (2016).</span> <a href="http://link.springer.com/chapter/10.1007%2F978-3-319-12943-3_17" target="_blank">A Comparative Study for Touchless Telerobotic Surgery</a>. <span style="color: #000000;">In <em>Computer-Assisted Musculoskeletal Surgery</em> (pp. 235-255). Springer International Publishing.</span></p>
<p><span style="color: #000000;">Zhou, Tian., Cabrera, Maria., &amp; Wachs, Juan (2015, January). </span><a href="https://pdfs.semanticscholar.org/61e4/4410b0ddd5a72001b6ed34f941ae963ab73a.pdf" target="_blank">Touchless telerobotic surgery-is it possible at all?</a><span style="color: #000000;">. In <em>AAAI</em> (pp. 4228-4230).</span></p>
<p><span style="color: #000000;">Zhou, Tian., Cabrera, Maria., &amp; Wachs, Juan, (2015) <a href="https://www.dropbox.com/s/jgqmg385es740xi/Book%20of%20abstracts%20online.pdf?dl=0">Communication Modalities for Supervised Teleoperation in Highly Dexterous Tasks &#8211; Does one size fit all?</a>.<span style="color: #000000;"> In <em>2nd Workshop on the role of Human Sensorimotor Control in Surgical Robotics, in Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on.</em> IEEE<em>.</em></span></span></p>
<p><span style="color: #000000;">Zhou, Tian., Cabrera, Maria., &amp; Wachs, Juan, (2014)</span> <a href="http://inrol.snu.ac.kr/Telerobotics-CS.pdf">Touchless telerobotic surgery &#8211; A comparative study</a>.<span style="color: #000000;"> In <em>3rd Workshop on Telerobotics for Real-Life Applications, Opportunities, Challenges and New Developments, in </em><em>Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on.</em> IEEE<em>.</em></span></p>
<p><span style="color: #000000;"><strong><span style="font-size: x-large;">Videos:</span></strong></span></p>
<h2 class="wsite-content-title"><span style="color: #000000; font-size: large;">Incision task with Omega sensor</span></h2>
<p><iframe src="//www.youtube.com/embed/nu2WO7mZEgw" width="425" height="350"></iframe></p>
<h2 class="wsite-content-title"><span style="color: #000000; font-size: large;">Peg transfer with Leap Motion</span></h2>
<p><iframe src="//www.youtube.com/embed/NZseYREHI0U" width="425" height="350"></iframe></p>
<h2 class="wsite-content-title"><span style="color: #000000; font-size: large;">Threading task with Leap Motion</span></h2>
<p><iframe src="//www.youtube.com/embed/acDRttXBSEk" width="425" height="350"></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/telerobotic-surgery-with-free-hand-gestures/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Surgical Instrument Recognition via Vision and Manipulation</title>
		<link>https://Engineering.Purdue.Edu/isat/surgical-instrument-recognition-via-vision-and-manipulation/</link>
		<comments>https://Engineering.Purdue.Edu/isat/surgical-instrument-recognition-via-vision-and-manipulation/#comments</comments>
		<pubDate>Thu, 26 Apr 2018 19:49:00 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Research]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=285</guid>
		<description><![CDATA[Description: US hospitals are facing a severe shortage of registered nurses, which could lead to an increase in mortality rates. One solution to this challenge is to bring a Robotic Scrub Nurse (RSN) into the Operating Room (OR) to free nurses from mundane and repetitive tasks such as instrument delivery and retrieval.…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/surgical-instrument-recognition-via-vision-and-manipulation/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p><span style="color: #000000;"><strong><span style="font-size: x-large;">Description: </span></strong></span></p>
<p style="text-align: justify;"><span style="font-size: medium; color: #000000;">US hospitals are facing great shortage of registered nurses, which could lead to an increment in mortality rate. One solution to this challenge is to bring Robotic Scrub Nurse (RSN) into the Operating Room (OR) to free nurses from mundane and repetitive tasks such as instrument delivery and retrieval.</span></p>
<p style="text-align: justify;"><span style="color: #000000;">As an important building block for RSN, this paper presents an accurate and robust surgical instrument recognition algorithm. Surgical instruments are often cluttered, occluded and display specular light, which causes a challenge for conventional recognition algorithms. A learning-through-interaction paradigm was proposed to tackle the challenge, which combines computer vision with robot manipulation and achieves active recognition. The unknown instrument is firstly segmented out as blobs and its poses estimated, then the RSN system picks it up and presents it to an optical sensor in a determined pose. </span>Lastly<span style="color: #000000;"> the unknown instrument is recognized with high confidence.</span></p>
<p style="text-align: justify;"><span style="color: #000000;">Experiments were then conducted to evaluate the performance of the proposed segmentation and recognition algorithms, respectively. It is found out that the proposed patch-based segmentation algorithm and attention-based recognition algorithm greatly outperform their benchmark comparisons, proving the applicability and effectiveness of </span>a RSN<span style="color: #000000;"> to perform accurate and robust surgical instrument recognition tasks.</span></p>
<p><span style="color: #000000;"><strong><span style="font-size: x-large;">Publications:</span></strong></span><br />
<span style="color: #000000;">Zhou, Tian., &amp; Wachs, Juan. </span><a href="https://drive.google.com/file/d/0B1tBbdlHcKS-RWhWX2gzNkQ0Q0k/view?usp=sharing" target="_blank">Finding a Needle in a Haystack: Recognizing Surgical Instruments through Vision and Manipulation</a><span style="color: #2a2a2a;">. <span style="color: #000000;">In </span></span><span style="color: #000000;"><em>SPIE/IS&amp;T Electronic Imaging, </em>no. 9, pp. 37–45, IS&amp;T,</span><span style="color: #2a2a2a;"><span style="color: #000000;"> 2017.</span> </span><em><span style="color: #ff0000;">Best Student Paper<br />
</span></em><br />
<span style="color: #000000;">Zhou, Tian, and Juan P. Wachs.</span> &#8220;<a href="https://www.sciencedirect.com/science/article/pii/S0921889016305310">Needle in a haystack: Interactive surgical instrument recognition through perception and manipulation</a>.&#8221; <span style="color: #000000;"><i>Robotics and Autonomous Systems</i> 97 (2017): 182-192.</span></p>
<p>&nbsp;</p>
<p><span style="color: #000000;"><strong><span style="font-size: x-large;">Videos:</span></strong></span></p>
<p><iframe src="//www.youtube.com/embed/FDCAuYoUP_s" width="555" height="456"></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/surgical-instrument-recognition-via-vision-and-manipulation/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Cabrera wins Outstanding Research Award from College of Engineering</title>
		<link>https://Engineering.Purdue.Edu/isat/cabrera-wins-outstanding-research-award-from-college-of-engineering/</link>
		<comments>https://Engineering.Purdue.Edu/isat/cabrera-wins-outstanding-research-award-from-college-of-engineering/#comments</comments>
		<pubDate>Tue, 24 Apr 2018 23:23:52 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Awards]]></category>
		<category><![CDATA[News]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=319</guid>
		<description><![CDATA[One of our lab members, Maru Cabrera, was awarded the 2018 College of Engineering Graduate Outstanding Research Award. Students are nominated for the award by the faculty based on numerous factors, including peer-reviewed publications, awards received for presentations or publications, potential contribution to the field and society, leadership in a…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/cabrera-wins-outstanding-research-award-from-college-of-engineering/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p style="text-align: justify;">One of our lab members, Maru Cabrera, was awarded the 2018 College of Engineering Graduate Outstanding Research Award.</p>
<p style="text-align: justify;">Students are nominated for the award by the faculty based on numerous factors, including peer-reviewed publications, awards received for presentations or publications, potential contribution to the field and society, leadership in a research group or community, participation in professional societies, and overall academic achievement.</p>
<p style="text-align: justify;">She was recognized at the 2018 College of Engineering Graduate Student Research, Teaching, and Service Awards Luncheon on April 24th.</p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG-20180424-WA0005.jpg"><img class="aligncenter size-full wp-image-320" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG-20180424-WA0005.jpg" alt="IMG-20180424-WA0005" width="1200" height="1600" /></a></p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/cabrera-wins-outstanding-research-award-from-college-of-engineering/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Dr. Gregory Hager, JHU, visits ISAT</title>
		<link>https://Engineering.Purdue.Edu/isat/dr-gregory-hager-jhu-visits-isat/</link>
		<comments>https://Engineering.Purdue.Edu/isat/dr-gregory-hager-jhu-visits-isat/#comments</comments>
		<pubDate>Fri, 06 Apr 2018 23:18:16 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Lab Visits]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=303</guid>
		<description><![CDATA[Dr. Juan P. Wachs and members of the ISAT laboratory welcomed Dr. Gregory Hager, from Johns Hopkins University, during his visit to Purdue University on April 4th, 2018. Glebys Gonzalez and Natalia Sanchez Tamayo demonstrated the ABB YuMi robot, with a 3D-printed surgical gripper, performing suture needle manipulation. Maria Cabrera demonstrated the telementoring system through Augmented Reality on the…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/dr-gregory-hager-jhu-visits-isat/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p>Dr. Juan P. Wachs and members of the ISAT laboratory welcomed Dr. Gregory Hager, from Johns Hopkins University, during his visit to Purdue University on April 4th, 2018.</p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/20180404_140406.jpg"><img class="aligncenter wp-image-306 size-large" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/20180404_140406-1024x768.jpg" alt="GregHager_Yumidemo" width="900" height="675" /></a></p>
<p>Glebys Gonzalez and Natalia Sanchez Tamayo demonstrated the ABB YuMi robot, equipped with a 3D-printed surgical gripper, performing suture needle manipulation.</p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_20180404_140858.jpg"><img class="aligncenter wp-image-307 size-large" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_20180404_140858-768x1024.jpg" alt="GregHager_STARdemo" width="768" height="1024" /></a></p>
<p>Maria Cabrera demonstrated the telementoring system through augmented reality on the Microsoft HoloLens.</p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/dr-gregory-hager-jhu-visits-isat/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>ISAT Lab @ IEEE VR&#8217;18</title>
		<link>https://Engineering.Purdue.Edu/isat/isat-lab-ieee-vr18/</link>
		<comments>https://Engineering.Purdue.Edu/isat/isat-lab-ieee-vr18/#comments</comments>
		<pubDate>Sat, 24 Mar 2018 13:00:25 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Conferences]]></category>
		<category><![CDATA[News]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=372</guid>
		<description><![CDATA[ISAT member Edgar Rojas co-presented an oral presentation on March 18, 2018 at the VAR4Good workshop of the IEEE VR 2018 conference, held in Reutlingen, Germany. Rojas gave a talk regarding the current state of the System for Telementoring with Augmented Reality (STAR). The talk included a comparison between two different…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/isat-lab-ieee-vr18/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p>ISAT member Edgar Rojas co-presented an oral presentation on March 18, 2018 at the VAR4Good workshop of the IEEE VR 2018 conference, held in Reutlingen, Germany.</p>
<p>Rojas gave a talk regarding the current state of the <span style="color: #008080;"><a style="color: #008080;" href="https://engineering.purdue.edu/starproj/" target="_blank">System for Telementoring with Augmented Reality</a></span> (STAR). The talk included a comparison between two STAR versions: one tablet-based and one using a head-mounted display.</p>
<p>A poster summarizing the talk can be found <span style="color: #008080;"><a style="color: #008080;" href="https://drive.google.com/open?id=19CCooj-x_wlNO2ADtBO3YDrMG99IZQgx" target="_blank">here</a></span>.</p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/conference1.jpg"><img class="aligncenter size-full wp-image-378" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/conference1.jpg" alt="conference" width="720" height="582" /></a></p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/isat-lab-ieee-vr18/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Prof. Allison Okamura, Stanford, visits Purdue</title>
		<link>https://Engineering.Purdue.Edu/isat/prof-allison-okamura-stanford-visits-purdue/</link>
		<comments>https://Engineering.Purdue.Edu/isat/prof-allison-okamura-stanford-visits-purdue/#comments</comments>
		<pubDate>Fri, 23 Mar 2018 18:37:04 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Lab Visits]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=323</guid>
		<description><![CDATA[Dr. Allison Okamura visited Purdue University as a guest presenter for the Purdue Robotics Accelerator initiative. Her seminar was on March 23rd, 2018. Prof. Okamura&#8217;s research focuses on developing the principles and tools needed to realize advanced robotic and human-machine systems capable of haptic (touch) interaction, particularly for biomedical applications.…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/prof-allison-okamura-stanford-visits-purdue/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p style="text-align: justify;">Dr. Allison Okamura visited Purdue University as a guest presenter for the Purdue Robotics Accelerator initiative. Her seminar was on March 23rd, 2018.</p>
<p style="text-align: justify;">Prof. Okamura&#8217;s research focuses on developing the principles and tools needed to realize advanced robotic and human-machine systems capable of haptic (touch) interaction, particularly for biomedical applications. Haptic systems are designed and studied using both analytical and experimental approaches.</p>
<p style="text-align: justify;">As a part of her visit, Prof. Okamura visited the ISAT lab where graduate students presented some of the research projects currently active.</p>
<p style="text-align: justify;">Tian Zhou, a PhD Candidate, showed his work involving the WAM Robot picking and delivering surgical instruments and his research regarding turn-taking in human-robot collaboration.</p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/20180323_093701.jpg"><img class="aligncenter size-large wp-image-324" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/20180323_093701-1024x768.jpg" alt="20180323_093701" width="900" height="675" /></a></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p style="text-align: justify;">Natalia Sanchez and Glebys Gonzalez presented the collaborative robot YuMi (ABB) which has been adapted through rapid prototyping, with extended grippers for research in robotic assisted surgery. Additionally, Prof. Okamura tested the HTC Vive virtual reality headset and controllers.</p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/20180323_094501.jpg"><img class="aligncenter size-large wp-image-325" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/20180323_094501-1024x768.jpg" alt="20180323_094501" width="900" height="675" /></a></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>To conclude the visit, Prof. Okamura participated as an impromptu trainee to showcase the functionality of the STAR telementoring system, wearing the Microsoft HoloLens, through which annotations are overlaid in the user&#8217;s field of view using augmented reality.</p>
<div style="width: 900px; " class="wp-video"><!--[if lt IE 9]><script>document.createElement('video');</script><![endif]-->
<video class="wp-video-shortcode" id="video-323-1" width="900" height="506" preload="none" controls="controls"><source type="video/mp4" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/VID_20180323_095511.mp4?_=1" /><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/VID_20180323_095511.mp4">https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/VID_20180323_095511.mp4</a></video></div>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/prof-allison-okamura-stanford-visits-purdue/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
<enclosure url="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/VID_20180323_095511.mp4" length="27711866" type="video/mp4" />
		</item>
		<item>
		<title>ISAT members presented 3 posters at HRI 2018 Chicago</title>
		<link>https://Engineering.Purdue.Edu/isat/isat-members-presented-3-posters-at-hri-2018-chicago/</link>
		<comments>https://Engineering.Purdue.Edu/isat/isat-members-presented-3-posters-at-hri-2018-chicago/#comments</comments>
		<pubDate>Sat, 10 Mar 2018 19:35:43 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Conferences]]></category>
		<category><![CDATA[News]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=277</guid>
		<description><![CDATA[ISAT members Tian Zhou, Maru Cabrera, and Natalia Sanchez presented their posters at the HRI 2018 conference in Chicago on March 6, 2018. Check it out!   &#160; Zhou, T., Cha, J. S., Gonzalez, G. T., Wachs, J. P., Sundaram, C., &#38; Yu, D. (2018, March). Joint Surgeon Attributes Estimation in Robot-Assisted…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/isat-members-presented-3-posters-at-hri-2018-chicago/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p style="text-align: left;"><span style="font-weight: 400;">ISAT members Tian Zhou, Maru Cabrera, and Natalia Sanchez presented their posters at HRI 2018 conference at Chicago on March 6, 2018. Check it out!  </span></p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_20180306_103553_1-e1524770541891.jpg"><img class=" wp-image-278 alignleft" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_20180306_103553_1-e1524770541891.jpg" alt="Tian@HRI2018" width="300" height="400" /></a></p>
<p>&nbsp;</p>
<p><span style="font-weight: 400;">Zhou, T., Cha, J. S., Gonzalez, G. T., Wachs, J. P., Sundaram, C., &amp; Yu, D. (2018, March). Joint Surgeon Attributes Estimation in Robot-Assisted Surgery. In </span><i><span style="font-weight: 400;">Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction</span></i><span style="font-weight: 400;"> (pp. 285-286). ACM.</span></p>
<p><a href="https://dl.acm.org/citation.cfm?id=3176981">Proceeding</a><br />
<a href="https://drive.google.com/file/d/1XWLjQI3eLayI2HpSZIvh42NVRfUk4OXI/view?usp=sharing" target="_blank">Poster</a></p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;<br />
<a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_20180306_104216-e1524770620721.jpg"><img class="size-full wp-image-280 alignleft" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_20180306_104216-e1524770620721.jpg" alt="Maru@HRI2018" width="300" height="400" /></a></p>
<p>&nbsp;</p>
<p><span style="font-weight: 400;">Cabrera, M. E., Voyles, R. M., &amp; Wachs, J. P. (2018, March). Coherence in One-Shot Gesture Recognition for Human-Robot Interaction. In </span><i><span style="font-weight: 400;">Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction</span></i><span style="font-weight: 400;"> (pp. 75-76). ACM.</span></p>
<p><a href="https://dl.acm.org/citation.cfm?id=3176977">Proceeding</a><br />
Poster</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p>&nbsp;</p>
<p><img class="size-full wp-image-279 alignleft" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_20180306_104100-e1524770669953.jpg" alt="Natalia@HRI2018" width="300" height="400" /></p>
<p>&nbsp;</p>
<p><span style="font-weight: 400;">Sanchez-Tamayo, N., &amp; Wachs, J. P. (2018, March). Collaborative Robots in Surgical Research: a Low-Cost Adaptation. In </span><i><span style="font-weight: 400;">Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction</span></i><span style="font-weight: 400;"> (pp. 231-232). ACM.</span></p>
<p><a href="https://dl.acm.org/citation.cfm?id=3176978">Proceeding</a><br />
<a href="https://drive.google.com/file/d/1jK1mvIXVIoThXvQXKWxVprX8Q-gyHOUq/view?usp=sharing">Poster</a></p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/isat-members-presented-3-posters-at-hri-2018-chicago/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Zhang team wins two student entrepreneur competitions</title>
		<link>https://Engineering.Purdue.Edu/isat/zhang-team-wins-two-student-entrepreneur-competitions/</link>
		<comments>https://Engineering.Purdue.Edu/isat/zhang-team-wins-two-student-entrepreneur-competitions/#comments</comments>
		<pubDate>Sat, 10 Mar 2018 03:52:51 +0000</pubDate>
		<dc:creator><![CDATA[ISAT]]></dc:creator>
				<category><![CDATA[Awards]]></category>

		<guid isPermaLink="false">https://Engineering.Purdue.Edu/isat/?p=343</guid>
		<description><![CDATA[In February, Ting Zhang, with Shruthi Suresh (Co-advised by Dr. Duerstock), won second place in the social track of the Burton D. Morgan Business Model Competition (BMC). Then on March 6, they won first place in the WomenIN Tech Pitch Competition, a Purdue Foundry initiative which showcases and helps fund early-stage, women-led technology startups in Indiana. A…<p class="continue-reading-button"> <a class="continue-reading-link" href="https://Engineering.Purdue.Edu/isat/zhang-team-wins-two-student-entrepreneur-competitions/">Continue reading<i class="crycon-right-dir"></i></a></p>]]></description>
				<content:encoded><![CDATA[<p>In February, Ting Zhang, with Shruthi Suresh (co-advised by Dr. Duerstock), won second place in the social track of the <a href="https://www.purdue.edu/discoverypark/bdmce/competitions/" target="_blank">Burton D. Morgan Business Model Competition</a> (BMC). Then on March 6, they won first place in the <a href="http://www.purduewomenin.org/womenin-tech-pitch-competition.html" target="_blank">WomenIN Tech Pitch Competition</a>, a <a href="https://purduefoundry.com/" target="_blank">Purdue Foundry</a> initiative that showcases and helps fund early-stage, women-led technology startups in Indiana.</p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/5D4_2387.jpg"><img class="aligncenter size-large wp-image-344" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/5D4_2387-1024x683.jpg" alt="5D4_2387" width="450" height="300" /></a></p>
<p><a href="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_2347.jpg"><img class="aligncenter size-large wp-image-345" src="https://Engineering.Purdue.Edu/isat/wp-content/uploads/2018/04/IMG_2347-1024x784.jpg" alt="IMG_2347" width="450" height="345" /></a></p>
<p>A detailed post can be found here: <a href="https://engineering.purdue.edu/IE/news/2018/zhang-2nd-place">https://engineering.purdue.edu/IE/news/2018/zhang-2nd-place</a></p>
<p>&nbsp;</p>
]]></content:encoded>
			<wfw:commentRss>https://Engineering.Purdue.Edu/isat/zhang-team-wins-two-student-entrepreneur-competitions/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
