{"id":20,"date":"2022-07-21T13:37:06","date_gmt":"2022-07-21T10:37:06","guid":{"rendered":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/?page_id=20"},"modified":"2023-06-02T12:01:21","modified_gmt":"2023-06-02T09:01:21","slug":"schedule","status":"publish","type":"page","link":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/?page_id=20","title":{"rendered":"Programme"},"content":{"rendered":"\n<h4 class=\"wp-block-heading\">Speakers and tentative schedule<br>Date: June 2, 2023<br>Start: 2:00 pm<br>End: 6:00 pm<\/h4>\n\n\n\n<ul class=\"wp-block-list\">\n<li>[02:00 \u2013 02:15] <strong>Opening remarks<\/strong>\n<ul class=\"wp-block-list\">\n<li>Organisers<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[02:15 \u2013 02:35] <strong>Invited talk 1<\/strong>  \n\n\n\n\n\n\n\n\n<ul class=\"wp-block-list\">\n<li>Speaker: Katerina Fragiadaki, Assistant Professor<\/li>\n\n\n\n<li>Bio: Katerina Fragkiadaki is an Assistant Professor in the Machine Learning Department at Carnegie Mellon University. She received her Ph.D. from the University of Pennsylvania and was a postdoctoral fellow at UC Berkeley and Google research after that.&nbsp; Her work is on learning visual representations with little supervision and combining spatial reasoning in deep visual learning.&nbsp; Her group develops algorithms for mobile computer vision,&nbsp; learning of physics, and common sense for agents that move around and interact with the world.&nbsp; Her work has been awarded a best Ph.D. 
thesis award, an NSF CAREER award, an AFOSR Young Investigator award, a DARPA Young Investigator award, and faculty research awards from Google, TRI, Amazon, UPMC, and Sony.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Tentative title: Active Vision for Robot Manipulation and Scene Re-arrangement<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/kfragki2-1.png\" alt=\"\" class=\"wp-image-337\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/kfragki2-1.png 358w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/kfragki2-1-300x300.png 300w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/kfragki2-1-150x150.png 150w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[02:35 \u2013 02:55] <strong>Invited talk 2<\/strong>\n<ul class=\"wp-block-list\">\n<li>Speaker: John Tsotsos, Professor at York University, Canada<\/li>\n\n\n\n<li>Bio: John Tsotsos is a Distinguished Research Professor of vision science at York University, Canada. After a postdoctoral fellowship in cardiology at Toronto General Hospital, he joined the University of Toronto on faculty in Computer Science and Medicine. 
In 1980, he founded the Computer Vision Group at the University of Toronto, which he led for 20 years. He was recruited to York University in 2000 as the Director of the Centre for Vision Research. His current research interests include a comprehensive theory of visual attention in humans. As a practical outlet for this theory, elements of it are embodied in the vision systems of mobile robots.<\/li>\n\n\n\n<li>Tentative title: Active visual sampling by aligning sensor and scene geometry<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/john-tsotsos_2.png\" alt=\"\" class=\"wp-image-347\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/john-tsotsos_2.png 480w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/john-tsotsos_2-300x300.png 300w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/john-tsotsos_2-150x150.png 150w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[02:55 \u2013 03:15] <strong>Invited talk 3<\/strong>\n<ul class=\"wp-block-list\">\n<li>Speaker: Yulia Sandamirskaya, Senior Researcher<\/li>\n\n\n\n<li>Bio: Yulia Sandamirskaya leads the Applications Research team of the Neuromorphic Computing 
Lab at Intel. Her team in Munich develops spiking neuronal network-based algorithms for neuromorphic hardware to demonstrate the potential of neuromorphic computing in robotics. She has 15 years of research experience in neural dynamics, embodied cognition, and autonomous robotics.&nbsp; She led a research group \u201cNeuromorphic Cognitive Robots\u201d at the Institute of Neuroinformatics of the University of Zurich and ETH Zurich, Switzerland, and the \u201cAutonomous learning\u201d group at the Institute for Neural Computation at the Ruhr-University Bochum.<\/li>\n\n\n\n<li>Tentative title: Active perception and learning in neuromorphic systems: from events to action plans and back<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Yulia-Sandamirskaya.jpg\" alt=\"\" class=\"wp-image-349\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Yulia-Sandamirskaya.jpg 512w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Yulia-Sandamirskaya-300x300.jpg 300w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Yulia-Sandamirskaya-150x150.jpg 150w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[03:15 \u2013 03:35] 
<strong>Invited talk 4<\/strong>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Speaker: Davide Scaramuzza, Professor <\/li>\n\n\n\n<li>Bio: Davide Scaramuzza is a Professor of Robotics and Perception at the University of Zurich, where he does research at the intersection of robotics, computer vision, and machine learning. He did his Ph.D. at ETH Zurich, a postdoc at the University of Pennsylvania, and was a visiting professor at Stanford University. His research focuses on the autonomous, agile navigation of micro drones using both standard and neuromorphic event-based cameras. He pioneered autonomous, vision-based navigation of drones, which inspired the navigation algorithm of the NASA Mars helicopter. He has been serving as a consultant for the United Nations on topics such as disaster response and disarmament, as well as the Fukushima Action Plan on Nuclear Safety. He has won many prestigious awards, such as a European Research Council Consolidator Grant, the IEEE Robotics and Automation Society Early Career Award, an SNF-ERC Starting Grant, a Google Research Award, a Facebook Distinguished Faculty Research Award, two NASA TechBrief Awards, and many paper awards. In 2015, he co-founded Zurich-Eye, today Facebook Zurich, which developed the world-leading virtual-reality headset Oculus Quest, with over 10 million units sold. In 2020, he co-founded SUIND, which builds autonomous drones for precision agriculture. 
Many aspects of his research have been prominently featured in broader media, such as The New York Times, The Economist, Forbes, BBC News, and Discovery Channel.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Tentative title: Perception-aware planning and control<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/davide.jpg\" alt=\"\" class=\"wp-image-358\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/davide.jpg 900w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/davide-300x300.jpg 300w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/davide-150x150.jpg 150w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/davide-768x771.jpg 768w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<ul class=\"wp-block-list\">\n<li>[03:35 \u2013 03:55] <strong>Paper presenters \/ Coffee break<\/strong><\/li>\n<\/ul>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[03:55 \u2013 04:15] <strong>Invited talk 5<\/strong>\n<ul class=\"wp-block-list\">\n<li>Speaker: Alexandre Bernardino, Associate Professor <\/li>\n\n\n\n<li>Bio: Alexandre Bernardino is a tenured Associate Professor at the Department of Electrical and 
Computer Engineering and Senior Researcher at the Computer and Robot Vision Laboratory of the Institute for Systems and Robotics at IST, the engineering school of the University of Lisbon. His main research interests include applying computer vision, machine learning, cognitive science, and control theory to advanced robotics and automation systems.<\/li>\n\n\n\n<li>Tentative title: Active Semantic Foveal Vision<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Alexandre-Bernardino.png\" alt=\"\" class=\"wp-image-340\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Alexandre-Bernardino.png 500w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Alexandre-Bernardino-300x300.png 300w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Alexandre-Bernardino-150x150.png 150w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[04:15 \u2013 04:35] <strong>Invited talk 6<\/strong>\n<ul class=\"wp-block-list\">\n<li>Speaker: Michael Milford, Professor <\/li>\n\n\n\n<li>Bio: Professor Michael Milford is a multi-award-winning educational entrepreneur who conducts interdisciplinary research at the boundary between robotics, 
neuroscience, and computer vision. His research models the neural mechanisms in the brain underlying tasks like navigation and perception to develop new technologies in challenging application domains such as all-weather and anytime positioning for autonomous vehicles. He has led or co-led projects collaborating with leading global organizations, including Amazon, Google, Intel, Ford, Rheinmetall, the Air Force Office of Scientific Research, NASA, Harvard, Oxford, and MIT. From 2022 to 2027, he is leading a large research team combining bio-inspired and computer-science-based approaches to provide a ubiquitous alternative to GPS that does not rely on satellites. He currently holds the positions of Australian Research Council Laureate Fellow, Joint Director of the QUT Centre for Robotics, and QUT Professor of Robotics.<\/li>\n\n\n\n<li>Tentative title: Closing the loop on localization: Active navigation for adversity- and adversarial-robustness<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Michael_Milford_Research.jpg\" alt=\"\" class=\"wp-image-354\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Michael_Milford_Research.jpg 718w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Michael_Milford_Research-300x300.jpg 300w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/Michael_Milford_Research-150x150.jpg 150w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex 
wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[04:35 \u2013 04:55] <strong>Invited talk 7<\/strong>\n<ul class=\"wp-block-list\">\n<li>Speaker: Guido de Croon, Professor <\/li>\n\n\n\n<li>Bio: Guido de Croon received his M.Sc. and Ph.D. in Artificial Intelligence (AI) at Maastricht University, the Netherlands. His research interests lie in computationally efficient, bio-inspired algorithms for robot autonomy, with an emphasis on computer vision. Since 2008, he has worked on algorithms for achieving autonomous flight with small and lightweight flying robots, such as the DelFly flapping-wing MAV. From 2011 to 2012, he was a research fellow in the Advanced Concepts Team of the European Space Agency, where he studied topics such as optical-flow-based control algorithms for extraterrestrial landing scenarios. After his return to TU Delft, his work has included the fully autonomous flight of a 20-gram DelFly, a new theory on active distance perception with optical flow, and a swarm of tiny drones able to explore unknown environments. More recently, he proposed an explanation for how insects actively determine their orientation with respect to the gravity direction. 
Currently, he is a Full Professor at TU Delft and scientific lead of its Micro Air Vehicle Lab (MAVLab).<\/li>\n\n\n\n<li>Tentative title: Weird cases of active vision in optical flow control<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/decroon.jpg\" alt=\"\" class=\"wp-image-343\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/decroon.jpg 301w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/decroon-150x150.jpg 150w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns are-vertically-aligned-center is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<ul class=\"wp-block-list\">\n<li>[04:55 \u2013 05:15] <strong>Invited talk 8<\/strong>\n<ul class=\"wp-block-list\">\n<li>Speaker: Kostas Alexis, Professor <\/li>\n\n\n\n<li>Bio: Kostas Alexis is a Full Professor at the Department of Engineering Cybernetics of the Norwegian University of Science and Technology (NTNU). Highlights of his research include leading Team CERBERUS to victory in the DARPA Subterranean Challenge and a host of contributions in the domain of resilient robotic autonomy \u2013 in perception, planning, and control, including learned navigation policies. 
Earlier research included setting the endurance world record for UAVs in the below-50 kg class, with AtlantikSolar flying continuously for 81.5 hours. Since becoming a professor, initially in the US and then in Norway, he has been the PI for a host of grants from NSF, DARPA, NASA, DOE, USDA, Horizon Europe, the Research Council of Norway, and other public and private sources.<\/li>\n\n\n\n<li>Tentative title: Combined deep collision prediction and informative navigation for aerial robots<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><div class=\"wp-block-image is-style-rounded\">\n<figure class=\"aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/alexis.jpg\" alt=\"\" class=\"wp-image-357\" width=\"177\" height=\"177\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/alexis.jpg 900w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/alexis-300x300.jpg 300w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/alexis-150x150.jpg 150w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/alexis-768x768.jpg 768w\" sizes=\"auto, (max-width: 177px) 100vw, 177px\" \/><\/figure>\n<\/div><\/div>\n<\/div>\n\n\n\n<ul class=\"wp-block-list\">\n<li>[05:15 \u2013 05:45] <strong>Panel discussion<\/strong><\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li>[05:45 \u2013 06:00] <strong>Closing remarks<\/strong><\/li>\n<\/ul>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><a href=\"https:\/\/www.icra2023.org\/\" target=\"_blank\" rel=\"noreferrer noopener\"><img loading=\"lazy\" decoding=\"async\" width=\"482\" height=\"216\" 
src=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/image.png\" alt=\"\" class=\"wp-image-162\" srcset=\"https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/image.png 482w, https:\/\/robotics.pme.duth.gr\/workshop_active2\/wp-content\/uploads\/2022\/09\/image-300x134.png 300w\" sizes=\"auto, (max-width: 482px) 100vw, 482px\" \/><\/a><\/figure>\n<\/div>\n\n\n<ul class=\"wp-block-social-links aligncenter is-layout-flex wp-block-social-links-is-layout-flex\"><li class=\"wp-social-link wp-social-link-facebook  wp-block-social-link\"><a href=\"https:\/\/www.facebook.com\/activevisionWS\" class=\"wp-block-social-link-anchor\"><svg width=\"24\" height=\"24\" viewBox=\"0 0 24 24\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M12 2C6.5 2 2 6.5 2 12c0 5 3.7 9.1 8.4 9.9v-7H7.9V12h2.5V9.8c0-2.5 1.5-3.9 3.8-3.9 1.1 0 2.2.2 2.2.2v2.5h-1.3c-1.2 0-1.6.8-1.6 1.6V12h2.8l-.4 2.9h-2.3v7C18.3 21.1 22 17 22 12c0-5.5-4.5-10-10-10z\"><\/path><\/svg><span class=\"wp-block-social-link-label screen-reader-text\">Facebook<\/span><\/a><\/li>\n\n<li class=\"wp-social-link wp-social-link-instagram  wp-block-social-link\"><a href=\"https:\/\/www.instagram.com\/activevision.workshop\/\" class=\"wp-block-social-link-anchor\"><svg width=\"24\" height=\"24\" viewBox=\"0 0 24 24\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M12,4.622c2.403,0,2.688,0.009,3.637,0.052c0.877,0.04,1.354,0.187,1.671,0.31c0.42,0.163,0.72,0.358,1.035,0.673 c0.315,0.315,0.51,0.615,0.673,1.035c0.123,0.317,0.27,0.794,0.31,1.671c0.043,0.949,0.052,1.234,0.052,3.637 s-0.009,2.688-0.052,3.637c-0.04,0.877-0.187,1.354-0.31,1.671c-0.163,0.42-0.358,0.72-0.673,1.035 c-0.315,0.315-0.615,0.51-1.035,0.673c-0.317,0.123-0.794,0.27-1.671,0.31c-0.949,0.043-1.233,0.052-3.637,0.052 
s-2.688-0.009-3.637-0.052c-0.877-0.04-1.354-0.187-1.671-0.31c-0.42-0.163-0.72-0.358-1.035-0.673 c-0.315-0.315-0.51-0.615-0.673-1.035c-0.123-0.317-0.27-0.794-0.31-1.671C4.631,14.688,4.622,14.403,4.622,12 s0.009-2.688,0.052-3.637c0.04-0.877,0.187-1.354,0.31-1.671c0.163-0.42,0.358-0.72,0.673-1.035 c0.315-0.315,0.615-0.51,1.035-0.673c0.317-0.123,0.794-0.27,1.671-0.31C9.312,4.631,9.597,4.622,12,4.622 M12,3 C9.556,3,9.249,3.01,8.289,3.054C7.331,3.098,6.677,3.25,6.105,3.472C5.513,3.702,5.011,4.01,4.511,4.511 c-0.5,0.5-0.808,1.002-1.038,1.594C3.25,6.677,3.098,7.331,3.054,8.289C3.01,9.249,3,9.556,3,12c0,2.444,0.01,2.751,0.054,3.711 c0.044,0.958,0.196,1.612,0.418,2.185c0.23,0.592,0.538,1.094,1.038,1.594c0.5,0.5,1.002,0.808,1.594,1.038 c0.572,0.222,1.227,0.375,2.185,0.418C9.249,20.99,9.556,21,12,21s2.751-0.01,3.711-0.054c0.958-0.044,1.612-0.196,2.185-0.418 c0.592-0.23,1.094-0.538,1.594-1.038c0.5-0.5,0.808-1.002,1.038-1.594c0.222-0.572,0.375-1.227,0.418-2.185 C20.99,14.751,21,14.444,21,12s-0.01-2.751-0.054-3.711c-0.044-0.958-0.196-1.612-0.418-2.185c-0.23-0.592-0.538-1.094-1.038-1.594 c-0.5-0.5-1.002-0.808-1.594-1.038c-0.572-0.222-1.227-0.375-2.185-0.418C14.751,3.01,14.444,3,12,3L12,3z M12,7.378 c-2.552,0-4.622,2.069-4.622,4.622S9.448,16.622,12,16.622s4.622-2.069,4.622-4.622S14.552,7.378,12,7.378z M12,15 c-1.657,0-3-1.343-3-3s1.343-3,3-3s3,1.343,3,3S13.657,15,12,15z M16.804,6.116c-0.596,0-1.08,0.484-1.08,1.08 s0.484,1.08,1.08,1.08c0.596,0,1.08-0.484,1.08-1.08S17.401,6.116,16.804,6.116z\"><\/path><\/svg><span class=\"wp-block-social-link-label screen-reader-text\">Instagram<\/span><\/a><\/li>\n\n<li class=\"wp-social-link wp-social-link-twitter  wp-block-social-link\"><a href=\"https:\/\/twitter.com\/ActivevisionWS\" class=\"wp-block-social-link-anchor\"><svg width=\"24\" height=\"24\" viewBox=\"0 0 24 24\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" aria-hidden=\"true\" focusable=\"false\"><path 
d=\"M22.23,5.924c-0.736,0.326-1.527,0.547-2.357,0.646c0.847-0.508,1.498-1.312,1.804-2.27 c-0.793,0.47-1.671,0.812-2.606,0.996C18.324,4.498,17.257,4,16.077,4c-2.266,0-4.103,1.837-4.103,4.103 c0,0.322,0.036,0.635,0.106,0.935C8.67,8.867,5.647,7.234,3.623,4.751C3.27,5.357,3.067,6.062,3.067,6.814 c0,1.424,0.724,2.679,1.825,3.415c-0.673-0.021-1.305-0.206-1.859-0.513c0,0.017,0,0.034,0,0.052c0,1.988,1.414,3.647,3.292,4.023 c-0.344,0.094-0.707,0.144-1.081,0.144c-0.264,0-0.521-0.026-0.772-0.074c0.522,1.63,2.038,2.816,3.833,2.85 c-1.404,1.1-3.174,1.756-5.096,1.756c-0.331,0-0.658-0.019-0.979-0.057c1.816,1.164,3.973,1.843,6.29,1.843 c7.547,0,11.675-6.252,11.675-11.675c0-0.178-0.004-0.355-0.012-0.531C20.985,7.47,21.68,6.747,22.23,5.924z\"><\/path><\/svg><span class=\"wp-block-social-link-label screen-reader-text\">Twitter<\/span><\/a><\/li>\n\n<li class=\"wp-social-link wp-social-link-youtube  wp-block-social-link\"><a href=\"https:\/\/www.youtube.com\/channel\/UCj2pmtZV8ZJIVq2BUX2xoaQ\" class=\"wp-block-social-link-anchor\"><svg width=\"24\" height=\"24\" viewBox=\"0 0 24 24\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M21.8,8.001c0,0-0.195-1.378-0.795-1.985c-0.76-0.797-1.613-0.801-2.004-0.847c-2.799-0.202-6.997-0.202-6.997-0.202 h-0.009c0,0-4.198,0-6.997,0.202C4.608,5.216,3.756,5.22,2.995,6.016C2.395,6.623,2.2,8.001,2.2,8.001S2,9.62,2,11.238v1.517 c0,1.618,0.2,3.237,0.2,3.237s0.195,1.378,0.795,1.985c0.761,0.797,1.76,0.771,2.205,0.855c1.6,0.153,6.8,0.201,6.8,0.201 s4.203-0.006,7.001-0.209c0.391-0.047,1.243-0.051,2.004-0.847c0.6-0.607,0.795-1.985,0.795-1.985s0.2-1.618,0.2-3.237v-1.517 C22,9.62,21.8,8.001,21.8,8.001z M9.935,14.594l-0.001-5.62l5.404,2.82L9.935,14.594z\"><\/path><\/svg><span class=\"wp-block-social-link-label screen-reader-text\">YouTube<\/span><\/a><\/li><\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Speakers and tentative scheduleDate: June 2, 2023Start: 2:00 pmEnd: 6:00 
pm<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-20","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=\/wp\/v2\/pages\/20","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=20"}],"version-history":[{"count":99,"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=\/wp\/v2\/pages\/20\/revisions"}],"predecessor-version":[{"id":599,"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=\/wp\/v2\/pages\/20\/revisions\/599"}],"wp:attachment":[{"href":"https:\/\/robotics.pme.duth.gr\/workshop_active2\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=20"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}