TUESDAY, NOVEMBER 12, 2019
7:30 am - 9:00 am
Registration and Breakfast
9:00 am - 9:15 am
Welcome
9:15 am - 10:15 am
KEYNOTE: Human Augmentation Robots for the Automation Age: An Extraordinary Category of Cobots
Ben Wolff, Sarcos Robotics
Ben Wolff

Ben Wolff

Chairman, CEO & Director
Sarcos Robotics

Human Augmentation Robots for the Automation Age: An Extraordinary Category of Cobots

Although there are many definitions for cobots or collaborative robots, one commonly accepted definition is a robot that is intended to physically interact with humans in a shared workspace. Another is a computer-controlled robotic apparatus that assists a human worker, as on an assembly line, by guiding or redirecting motions initiated by the worker, who provides the motive power. With these definitions in mind, wearable robots -- robots that augment human performance, such as powered, full-body exoskeletons -- are an extraordinary category of collaborative robots that are coming soon to a workplace near you.
The successful pairing of man and machine represented by robots that augment humans has the potential to make our workforce safer and more productive, enabling more people to do more, more safely, than ever before. These machines bring together the best of what humans and machines have to offer -- combining human experience, wisdom, intuition and judgment with the strength, endurance and precision of machines.
The synergy between man and machine will be further enhanced with machine learning and artificial intelligence technologies, ultimately enabling humans to teach their robot partners how to perform complex tasks in complex environments the way we humans do things. The ultimate objective is to deliver the economic and safety benefits typically associated with automation, but for those tasks or environments that are too complex, too random or too diverse for automation to be effective, while at the same time making more physically demanding jobs available to more people than ever before.
Sarcos Robotics has more than 25 years of experience developing biologically-inspired robotic systems. The company’s chairman and CEO Ben Wolff has a unique vantage point to observe how wearable robots, such as powered, full-body exoskeletons, will change the productivity and safety equation for many industries, empowering the workforce of the future.

10:15 am - 10:45 am
Deep Learning Machine Vision for Beginners - No Knowledge Required
Track: Advanced Vision
Andy Long

Andy Long

CEO
Cyth Systems

Deep Learning Machine Vision for Beginners - No Knowledge Required

This presentation will demonstrate how somebody with no prior experience in machine vision can utilize Artificial Intelligence (AI) for their vision applications. Through simple point-and-click, operators can now "program" the performance of their vision system with ease. Learn how AI can benefit what you're working on, today!

10:15 am - 10:45 am
Opening the Eyes of Robots: How Rapid Motion Planning is Setting Robots Free
Track: Collaborative & Mobile Robotics
George Konidaris

George Konidaris

Founder & Chief
Realtime Robotics

Opening the Eyes of Robots: How Rapid Motion Planning is Setting Robots Free

The role of robots is still relatively limited and has not fundamentally changed in the last two decades. Robots remain behind cages in industrial settings and require a vast amount of time and money to integrate into the workplace. Similarly, despite the hype around autonomous vehicles, the prospect of them roaming urban environments still seems distant.

Expanding automation is the key to increasing productivity and lowering costs; however, robots today are constrained by safety concerns and, as a result, are not able to operate at an efficient pace. To realize the benefits of the automation age, robots must be able to adapt to unstructured environments. This session will shine a spotlight on why rapid motion planning is critical to the wide-scale adoption of robots in any environment.

Rapid motion planning will transform how machines move by making it possible for robots, autonomous vehicles, and other machines to navigate dynamic environments smoothly, quickly, and intuitively. It enables collaborative robots to work together at an efficient pace without the risk of collision. Robots will now be able to recalculate motion plans in milliseconds, allowing manufacturers to expand automation significantly and deliver material productivity gains.

The session will showcase how robots can safely collaborate with people without the risk of collision, as robots can now recognize and respond to changing environments. This will remove the cost and complexity of caging, guarding, and light curtains, which have long been a barrier to wide-scale adoption of robots.

Rapid motion planning is changing the face of automation by enabling robots to adapt to dynamic environments in milliseconds. Robots can now work together seamlessly, which drastically reduces the complexity of deployment and maintenance. Rapid motion planning is the catalyst that will fuel the mass adoption of robotic solutions.
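As a toy illustration of what millisecond-scale replanning means in principle (a sketch only, not Realtime Robotics' algorithm; the grid, start, and goal are invented), the planner below simply recomputes a path the moment its occupancy grid changes:

```python
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search over a 4-connected occupancy grid.
    Returns the shortest list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None

grid = [[0] * 5 for _ in range(5)]          # 5x5 free workspace
first = plan(grid, (0, 0), (4, 4))          # initial plan
grid[2][2] = 1                              # a person steps into cell (2, 2)
second = plan(grid, (0, 0), (4, 4))         # replan against the updated world
```

Real planners avoid the full recompute by precomputing the roadmap and collision geometry, which is what makes millisecond response rates plausible.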

10:15 am - 10:45 am
Autonomous Robots - Trends and Perspectives from Industry
Track: Artificial Intelligence
Christian Friedrich

Christian Friedrich

Roboticist
SCHUNK

Autonomous Robots - Trends and Perspectives from Industry

The flexibility that manufacturing systems require to produce today's variety of products is difficult to achieve with current automation solutions. Making manufacturing technology more flexible is only conceivable through the development and integration of autonomous functions in intelligent, distributed component systems. Only the use of system intelligence makes it possible to establish production processes that are otherwise difficult to automate. It also allows task-oriented programming, which drastically reduces the time-consuming application development performed by experts.

The presentation shows the basic technologies for building autonomous robotic systems and discusses their use in production environments. An overview of modern and new algorithms for perception, planning and control in the field of artificial intelligence will be given, along with how they can be transferred to revolutionize manufacturing automation. Further, a modular control architecture is shown, which allows the deployment of the different algorithms and processes. To enable this, we will provide a distributed control architecture that works across different field control layers. In addition, current application areas will be presented, with a focus on autonomous grasping and autonomous assembly. It will be explained how instructions for robot systems are created based on sensor and production data, and how complex handling and assembly processes can be flexibly automated without the need for a human expert.

10:45 am - 11:00 am
Break & Exhibit Time
11:00 am - 11:30 am
Industry 4.0 - Best Practices for Cloud Integration
Track: Advanced Vision
Darcy Bachert

Darcy Bachert

CEO
Prolucid

Industry 4.0 - Best Practices for Cloud Integration

Integration of cloud technology is becoming a critical component for manufacturers, as it can help transform business models, improve processes, and enable continuous improvement through integration of data with analytics and machine learning. That said, there are many potential challenges with cloud integration, including understanding the associated business costs, planning for security and regulatory considerations, and selecting and implementing technology that can scale and stay current over time.

This presentation will share real-world examples of adopting cloud technology and advanced analytics/machine learning into manufacturing operations, along with the challenges involved and best practices for tackling them.

We answer these questions about IoT connectivity:

  1. What are some of the benefits of cloud integration?
  2. What are some of the costs and risks of being connected?
  3. How is cloud integration being used in real-world applications?
  4. How should we approach security and data protection?
  5. What are some of the important design considerations?
11:00 am - 11:30 am
Dull, Dirty, or Dangerous: How to Build Cobots that Help Humans
Track: Collaborative Robots
Pablo Molina

Pablo Molina

CTO and co-founder
Avidbots

Dull, Dirty, or Dangerous: How to Build Cobots that Help Humans

Collaborative robots are emerging as a way to help humans avoid doing dull, dirty, and/or dangerous work. Where once the conversation was about robots "taking jobs," now we're discussing how robots can augment human potential. In this talk, hear from leading robot designer Pablo Molina, whose company Avidbots built the Neo floor-scrubbing robot. Neos are in use in malls, airports, factories, warehouses, schools, and train stations in 15 countries, doing a job that traditionally has over 300% turnover--cleaning floors in large commercial or industrial spaces.

Before Neo, janitors had to push a heavy floor-scrubbing machine around at night, risking back injury, as well as experiencing job burnout from the tedium and physical demands. Feedback from janitorial teams is that Neo is a welcome member of the team, freeing them up to focus on higher-value work such as cleaning bathrooms or tidying common areas. Neo also navigates secure areas where it's dangerous for a human to enter, such as power plants or warehouses where heavy objects and other robots could injure humans.

Neo's job is clearly dull, dangerous, and dirty, but he doesn't mind. But what technologies are needed to build collaborative robots that will be accepted as part of the team--and not seen as a threat? How can roboticists combine advanced vision, the latest sensors, and intuitive hardware and software design to create functional cobots that augment human capabilities, instead of replacing them?

Pablo designed Neo from the ground up, iterating through many prototypes before settling on its current design. He will share lessons learned from the design, manufacturing, and implementation process that illuminate the best strategies for creating cobots that will be embraced by their human coworkers.

NOTE: Pablo may be able to bring a Neo to demonstrate in the halls.

11:00 am - 11:30 am
The Latest in Deep Learning and Computer Vision
Track: Artificial Intelligence
Eric Danziger

Eric Danziger

CEO
Invisible AI

The Latest in Deep Learning and Computer Vision

Attendees will learn about the latest work in the fields of artificial intelligence, machine learning and deep learning. The talk will be presented by Eric Danziger from Invisible AI, who previously worked in the field of self-driving cars and now leads a company deploying AI-enabled vision sensors used for business intelligence. Eric will discuss the state-of-the-art research in computer vision and highlight the need to commercially deploy this technology today. While self-driving cars may still be years away, computer vision is ready to revolutionize many businesses immediately, especially in the world of supply chain. Conference attendees should participate in this session if they would like to learn more about how AI and computer vision can help their businesses today. The talk will also discuss examples of how other businesses are already deploying computer vision solutions to streamline their operations.

11:30 am - 12:00 pm
Low-Cost High-Speed Bin Picking via Commercial Cameras and AI
Track: Advanced Vision
Paul Thomas

Paul Thomas

TBA
Procter & Gamble

Low-Cost High-Speed Bin Picking via Commercial Cameras and AI

Session description coming soon.

11:30 am - 12:00 pm
Robotic Force/Torque Sensing and Smart Manufacturing
Track: Collaborative Robots
Ian Stern

Ian Stern

Force/Torque Sensor Product Manager
ATI Industrial Automation

Robotic Force/Torque Sensing and Smart Manufacturing

Force/Torque sensors can add value to any robotic application, from simple decision making to complex active path modification and detection of process variations. Discover ways to implement force sensing, review the components of a successful application, and learn how complementary technologies give robots a sense of touch in any environment. Learn how ATI's Force/Torque sensing products can enable new applications and optimize existing processes.
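As a generic illustration of the "simple decision making" end of that spectrum (a hypothetical sketch, not an ATI product interface; thresholds are invented), a wrist-mounted force reading can gate an insertion task:

```python
def monitor_insertion(force_samples, contact_threshold=5.0, overload_threshold=30.0):
    """Classify a stream of axial force readings (in newtons) from a
    wrist force/torque sensor during an insertion move."""
    for i, f in enumerate(force_samples):
        if f >= overload_threshold:
            return ("abort", i)       # jam or collision: stop and retract
        if f >= contact_threshold:
            return ("contact", i)     # surface found: hand off to force control
    return ("no_contact", len(force_samples))

# Free motion for three samples, then a gentle contact on the fourth.
state, idx = monitor_insertion([0.2, 0.4, 0.3, 6.5, 7.0])
```

The same pattern scales up: active path modification replaces the simple return values with continuous corrections driven by the measured force vector.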

11:30 am - 12:00 pm
Edge Robotics: AI and Automation Meeting at the Edge
Track: Artificial Intelligence
Juan Aparicio

Juan Aparicio

Head Advanced Manufacturing Automation
Siemens Corporate Technology

Edge Robotics: AI and Automation Meeting at the Edge

This talk will motivate the need for more flexible automation and intelligent machines as a response to the transition from mass production to mass customization in manufacturing. Autonomy, understood as the evolution of automation with AI and digitalization ingredients, is presented as a solution. But nothing comes for free; autonomous machines will require a powerful brain, which may reside in the cloud or at the edge. The latest results in edge robotics will be presented.

Noon - 2:00 pm
Networking Lunch
2:00 pm - 2:30 pm
Deep Learning Vision Solutions in the Age of Industry 4.0
Track: Advanced Vision
Kyongsoo Noh

Kyongsoo Noh

Director of Global Business
SUALAB

Deep Learning Vision Solutions in the Age of Industry 4.0

1. Overview of Smart Factory Ecosystem
2. Technology Disruption Trends in Machine Vision Field
3. Introduction of Deep Learning and Remaining Problems
4. Deep Learning S/W Library for Machine Vision and its Functions

2:00 pm - 2:30 pm
Intelligent Tools to Shape Faster and Smarter Automation Adoption
Track: Collaborative Robots
Kristian Hulgard

Kristian Hulgard

General Manager - Americas
OnRobot

Intelligent Tools to Shape Faster and Smarter Automation Adoption

Smarter and more versatile robotic tools -- end-of-arm tooling (EOAT) sensors, grippers, and quick changers -- empower robots to take over repetitive tasks and handle adaptive, higher-precision, more intelligent applications that in the past were too complex to automate. More importantly, these advanced tools enable collaborative applications that allow workers and robots to operate safely side by side, thanks to the user-friendly nature, intuitive programming and safety features of EOAT-fitted robots.

2:00 pm - 2:30 pm
Positive ROI for short run automation -- finally!
Track: Artificial Intelligence
Ned Semonite

Ned Semonite

Chief Business Officer
Southie Autonomy

Positive ROI for short run automation -- finally!

Industrial robots can save a company money by performing difficult tasks over and over again. Yup - we got that.

Now collaborative robots can solve labor shortages by working alongside people, doing simpler tasks, repeatedly.  Yup - got that too.

But the numbers often don't add up when you've got short-run tasks performed by relatively unskilled labor, or by temporary or offshore workers with their low wages. If that robot isn't constantly productive - doing that one thing over and over - automation is usually more expensive. People can easily switch from one task to another - we're built that way. But a robot has to be re-programmed, and fixtures modified and tested thoroughly, before it can pick up the next job.

In this session, we'll provide three examples of companies that wanted to automate but couldn't, because the costs were too high - that is, until AI was used to lower those costs. We'll show the costs of their manual processes. Next, we'll show the costs they had been quoted to automate, and how those were higher than what they were paying for manual labor alone. We'll show the added benefits that help the automation equation, like the cost of hiring, rising healthcare, on-the-job safety, and job satisfaction. Although these are acknowledged, they are usually too difficult to measure, so the CFO won't take a risk and will decide simply based on direct labor cost savings. And then we'll show how the CFO will buy in when the robot can be used continuously - with the use of AI, AR and vision, by unskilled workers - to quickly automate different tasks, without expensive robotic expertise and continued custom fixturing.

2:30 pm - 3:00 pm
How AI, Robotics, Vision, and Industry 4.0 Will Revolutionize Manufacturing
Track: Advanced Vision
Grant Zahorsky

Grant Zahorsky

Technician
Canon USA

How AI, Robotics, Vision, and Industry 4.0 Will Revolutionize Manufacturing

Industry 4.0 is upon us, and with it, a myriad of new and emerging technologies. Machine vision, used in combination with artificial intelligence and deep learning, strives to make manufacturing systems more productive than ever by giving machines eyes with which to see and a mind able to comprehend their newfound sight. Companies around the world are seeking to improve their businesses by automating their workspaces, thereby increasing the effectiveness of their production. The result is a human-like, self-reliant industrial environment that is more durable and more profitable than we ever thought possible. This presentation aims to educate the audience on the key technologies that continue to revolutionize modern manufacturing. Industry 4.0 is more than just a 21st-century buzzword. It is the beginning of the future of automation and production as we know it.

2:30 pm - 3:00 pm
The Growing Pains of Mobile Robot and AGV Navigation
Track: Collaborative Robots
David Mindell

David Mindell

CEO/Founder
Humatics

The Growing Pains of Mobile Robot and AGV Navigation

Autonomous mobile robots (AMRs) and automated guided vehicles (AGVs) are essential to the Industry 4.0 ideal of smart, flexible factories and logistics centers, but the deficiencies of legacy navigation technologies are limiting their benefits. Magnetic tape and fiducial-based systems are inflexible and damage-prone, while LiDAR systems are expensive and restricted to indoor, structured environments. Vision-based navigation has come a long way but still fails in numerous industrial settings.

So, what's the solution? There is no one-size-fits-all solution across all AGV applications. Industrial AMR applications, in particular, require precise, robust and cost-effective positioning that can only be achieved with sensor fusion. This includes sensors such as LiDAR, vision and IMUs, as well as a new category of microlocation products built on radio-frequency technology that fuses time-of-flight measurements with inertial sensing to determine the 3D location of AMRs. This talk will discuss microlocation in relation to other, better-known navigation solutions.
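The time-of-flight side of such a system can be pictured with a standard linearized trilateration fix (a generic sketch with invented beacon positions, not Humatics' proprietary pipeline):

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Linearized least-squares position fix from ranges to known anchors.
    Subtracting the first range equation from the others cancels the
    quadratic |x|^2 term, leaving a linear system A x = b."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Four RF beacons at known, non-coplanar positions (meters).
anchors = [(0.0, 0.0, 3.0), (10.0, 0.0, 2.5), (0.0, 10.0, 3.5), (10.0, 10.0, 2.8)]
truth = np.array([4.0, 7.0, 0.5])                       # simulated AMR position
ranges = [np.linalg.norm(truth - np.array(a)) for a in anchors]
estimate = trilaterate(anchors, ranges)                 # recovers `truth`
```

In practice each range measurement carries noise, which is why the talk pairs such fixes with inertial sensing in a filter rather than using them raw.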

2:30 pm - 3:00 pm
IOT is an Ecosystem Play
Track: Artificial Intelligence
Irene Petrick

Irene Petrick

TBA
Intel

IOT is an Ecosystem Play

Successful IOT solutions require a complex interplay of HW and SW. Monolithic solution offerings from a single company, while they may feel more manageable, often disregard this interdependence and lead to increased complexity down the road. Future-proofing your business and your operations will require an ever-changing set of ecosystem partners to collaborate. Since no one technology is sufficient, companies with different capabilities will need to work together to develop, deliver and support solutions. In this complex digital world, we need to rethink how we find and vet an evolving set of ecosystem partners. This presentation is rooted in a series of studies conducted over two years with over 400 participants drawn from manufacturers and the companies that support them.

3:00 pm - 3:30 pm
Break
3:30 pm - 4:00 pm
The Biggest Challenges of Bin Picking
Track: Advanced Vision
Jan Zizka

Jan Zizka

CEO
Photoneo

The Biggest Challenges of Bin Picking

Has the development of 2D and 3D technology reached its peak or, on the contrary, is there still room for major advancements? How much fine-tuning do 3D machine vision systems need to undergo to make robot performance 100% reliable? This requires a perfect interplay between high scan accuracy, resolution and scanning speed. To what extent are the developers of 3D machine vision solutions able to meet these requirements, and what does the current market have to offer?

The field of AI and machine learning is also moving forward in giant leaps and, in combination with 3D machine vision systems, is finding an increasingly wide array of industrial applications. What methods exist for CAD-based matching on the one hand and for picking unknown items on the other, and where are they currently used or might they be used in the future? Another important feature in the context of industrial automation is path planning, as it is necessary for autonomous robot performance. Where does its development currently stand, and how many companies rely on this robotic ability?

And finally, innovations in the field of industrial automation in general and bin picking solutions in particular include new approaches to grasping methods as well as efforts to shorten the cycle times of object picking and placing. Which advancements in this area can we already enjoy and what remains subject to improvement? Answers to these questions come in the form of an insightful overview of current trends in the development of 3D machine vision technologies and solutions applied in bin picking applications.

3:30 pm - 4:00 pm
Breakthrough Applications in the Collaborative Robot Space
Track: Collaborative Robots
Tim DeGrasse

Tim DeGrasse

TBD
Universal Robots

Breakthrough Applications in the Collaborative Robot Space

Session description coming soon.

3:30 pm - 4:00 pm
Cloud Robotics: Pros, Cons, and Everything in Between
Track: Artificial Intelligence
Russell Toris

Russell Toris

Director of Robotics
Fetch Robotics

Cloud Robotics: Pros, Cons, and Everything in Between

Cloud robotics is emerging as a powerful way to implement on-demand automation. In the warehouse and logistics industry, there are many ways to leverage cloud robotics to optimize warehouse operations. Utilizing machine learning in the cloud can provide customers with insight into data analytics that previously wasn't possible. Imagine knowing exactly where every item is in the warehouse, and having a view into how items are moved and perhaps where they get lost. Or having the robots tell warehouse operators where congestion is in the warehouse, or being able to monitor safety incidents. Cloud robotics provides this level of insight into warehouse operations, all while the robot itself accomplishes the primary material handling task. Russell Toris, Director of Robotics at Fetch Robotics, will discuss how to leverage the cloud for robotics, what the cloud is good for, what limitations exist today, and what the future holds for cloud robotics.

4:00 pm - 5:00 pm
KEYNOTE: Volumetric Technologies for Future Sports Experiences
Sankar "Jay" Jayaram, Intel Sports
Sankar Jayaram

Sankar "Jay" Jayaram

CTO
Intel Sports

Volumetric Technologies for Future Sports Experiences

For the past 75+ years, the sports TV industry has relied on fixed-view cameras, with a producer/director using inputs from these cameras to tell a story to fans. This concept has not fundamentally changed in those decades. As we head into more interactive, lean-forward experiences with active participation by fans and personalization of content and experiences, new technologies such as digitization of playing fields using volumetric capture are poised to usher in a new era of sports experiences.

This talk will focus on Intel’s volumetric technologies and the creation of immersive media experiences based on these capabilities. Additional topics will cover the challenges in the entire end-to-end pipeline and the critical solutions needed to make volumetric-based real-time sports experiences a reality.

5:00 pm - 7:00 pm
Networking Reception

WEDNESDAY, NOVEMBER 13, 2019
7:30 am - 9:00 am
Registration and Breakfast
9:00 am - 9:15 am
Welcome
9:15 am - 10:15 am
KEYNOTE: Closing the Perception-Actuation Loop using Machine Learning: New Perspectives and Strategies
Vincent Vanhoucke, Google
Vincent Vanhoucke

Vincent Vanhoucke

Principal Scientist and Director of Robotics
Google

Closing the Perception-Actuation Loop using Machine Learning: New Perspectives and Strategies

Recent advances in perception technology, fueled by progress in deep learning, have materially changed the degree of situational awareness one can expect from robots engaged in the real world: in addition to perceiving the geometry of the world around them, robots can now also reason about its semantics, and communicate intuitively with the people sharing their environment.

Yet, we're arguably still struggling to deploy robots in human-centered environments. Much of the difficulty centers around closing the loop between perception and actuation in a manner that's safe, reliable, precise and flexible. I'll discuss recent progress in machine learning that directly addresses these challenges and opens up new avenues in connecting perception and behaviors in real-world environments.

10:15 am - 10:45 am
Changing the Perception of Archaic Industry
Track: Advanced Vision
Chris Laudando

Chris Laudando

Head of Strategic Partnerships
Motivo

Changing the Perception of Archaic Industry

While Silicon Valley works on the sexy problems of our time (e.g. autonomous vehicles), legacy industries like manufacturing, agriculture, and infrastructure remain largely reliant upon archaic technology. Fully autonomous robot taxis may be further away than we hoped, but the very same tools being developed for autonomous cars can be applied to less technically challenging problems in archaic industry. This session will discuss the marriage of Silicon Valley perception technology to archaic industry problems. By comparing & contrasting the feasibility of perception tech applied to different engineering problems, we'll highlight the extreme financial viability associated with "Big Tech" tools cleverly applied to manufacturing, agriculture, and infrastructure.

10:15 am - 10:45 am
Emerging Applications For Robots on Mobile Bases
Track: Collaborative Robots
Brian Carlisle

Brian Carlisle

CEO
Precise Automation

Emerging Applications For Robots on Mobile Bases

In the last few years, many applications have emerged for mobile bases, sometimes called "mobile robots," moving static payloads around factories, warehouses and institutions such as hospitals. More recently, successful applications have emerged for mounting robot manipulators on these mobile bases. Emerging applications include agriculture, machine loading, additive manufacturing, semiconductor tool loading, life sciences and laboratory automation. Application examples will be presented. These new applications place new requirements on robot kinematics, grippers, sensing, the interface between the robot and the mobile base, and the safety of the moving system. Robots, sensing, navigation systems and base platforms will need to be optimized for these new applications.

10:15 am - 10:45 am
Overview of New & Next Technology Advances in Human-Robot Interaction
Track: Artificial Intelligence
Arnie Kravitz

Arnie Kravitz

Chief Technology Officer
Advanced Robotics for Manufacturing (ARM)

Overview of New & Next Technology Advances in Human-Robot Interaction

To achieve the operational goal of automated manufacturing, many tasks will require a combination of the unique skills of humans and robots. This session shares the results of research completed by ARM and its members to identify the key trends, successes and challenges in human-robot interaction.

10:45 am - 11:00 am
Break
11:00 am - 11:30 am
Deep Learning as a Core Technology in Factory Automation
Track: Advanced Vision
Johannes Hiltner

Johannes Hiltner

Product Manager HALCON
MVTec Software GmbH

Deep Learning as a Core Technology in Factory Automation

Factory automation benefits strongly from advances in machine learning. Thanks to this technology, many applications can now be solved with higher accuracy or with less effort. Other applications can now be solved for the first time.

This presentation gives an insight into a range of typical real-life applications from industries like food, agriculture, automotive, pharmaceutical and electronics. We will show the challenges and explain the benefits deep learning brought to these applications.

11:00 am - 11:30 am
Speed and Separation Monitoring with Functional Safety: A Call to Action
Track: Collaborative Robots
Marek Wartenberg

Marek Wartenberg

Senior Engineer, Robot Dynamics and Controls
Veo Robotics

Speed and Separation Monitoring with Functional Safety: A Call to Action

Current manufacturing trends toward mass customization and faster product cycles mean that manufacturers can't amortize the costs of fully automated workcells; instead, they need human production workers. The rapid increase in popularity of power and force limited (PFL) robots has shown that collaborative applications can increase productivity, provide faster fault recovery and increase unit production rates. PFL robots perform well for certain applications but are too weak and slow for most durable goods manufacturing, in addition to losing their advantage if equipped with a dangerous end effector or payload.
A vision-based implementation of Speed and Separation Monitoring (SSM) provides a way to overcome these limitations, making large industrial robots aware of humans. Collaborative applications using SSM have far fewer limitations on end effector design, robot speed and payload, bringing the robot to a safe state whenever a dynamically calculated Protective Separation Distance (PSD) is violated. However, the level of achievable performance using SSM is critically dependent on the robotic system -- both the physics of the manipulator and the latency of its controllers. In this presentation we will comment on current trends in manufacturing, the limitations of existing robotic systems, the components of the PSD calculation, and how increased performance of robots and robot controllers can improve the functionality of SSM.
Ultimately, safeguarding using SSM works well today, and major manufacturers are buying the technology to implement collaborative applications in large industrial factories. However, there is a general need for low-latency safety-rated robot interfaces that can control, in real time, robot speed and trajectory through outside signals. Veo is seeking to work closely with robot manufacturers and industry participants to develop and execute on a technology roadmap shifting the mindset to a more open model of external control, enabling true real-time human-robot collaborative systems.
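For readers unfamiliar with the PSD, it follows the general structure of the separation-distance formula in ISO/TS 15066, sketched below in simplified form with illustrative numbers (this is a reading of the standard's structure, not Veo's implementation):

```python
def protective_separation_distance(v_h, v_r, t_r, t_s, s_stop,
                                   c=0.2, z_d=0.05, z_r=0.02):
    """Simplified PSD in the spirit of ISO/TS 15066 (units: meters, seconds).
    v_h: human approach speed        t_r: system reaction time
    v_r: robot speed toward human    t_s: robot stopping time
    s_stop: robot stopping distance  c, z_d, z_r: intrusion distance plus
                                     sensor and robot position uncertainties
    """
    s_h = v_h * (t_r + t_s)   # human closes distance while system reacts and robot stops
    s_r = v_r * t_r           # robot keeps moving during the reaction time
    return s_h + s_r + s_stop + c + z_d + z_r

# 1.6 m/s walking speed, 100 ms latency, 300 ms / 0.15 m stop, robot at 1 m/s.
psd = protective_separation_distance(v_h=1.6, v_r=1.0, t_r=0.1, t_s=0.3, s_stop=0.15)
```

The abstract's point about latency drops straight out of the formula: both travel terms scale with the reaction time, so a lower-latency safety-rated interface shrinks the required separation.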

11:00 am - 11:30 am
Automation, Digital Workers and the Future of Work
Track: Artificial Intelligence
Matt Vasey

Matt Vasey

Senior Director, Artificial Intelligence
Microsoft

Automation, Digital Workers and the Future of Work

The flywheel of modern business is based on humans consuming and acting on the data that flows from their company's products, services, and production lines. Today we see that half of all companies with digital transformation initiatives are making investments to automate processes, decisions, and actions that have traditionally been the realm of human workers. AI is driving automation in the physical and digital realms. This presentation will explore the impact of AI-powered automation on the future of work, and how A3 members can capitalize on this sea change.

11:30 am - 12:00 pm
Effectively Training a Deep Neural Network for Machine Vision
Track: Advanced Vision
Pierantonio Boriero

Pierantonio Boriero

Director of Product Management
Matrox Imaging

Effectively Training a Deep Neural Network for Machine Vision

Deep learning has undeniably taken the machine vision industry by storm. Its apparent simplification of the solution-building process certainly justifies the attention; however, misconceptions abound on how to apply the technology to machine vision applications. Training a deep neural network is as much about the know-how and experience of those doing it as it is about the richness and quality of the software environment used to perform it. This presentation will provide an overview of the activities and considerations needed to train an effective deep neural network for machine vision: from collecting, preparing and using the training data, to setting the hyper-parameters, to the potential pitfalls that can adversely affect the process and result.
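To make the hyper-parameter point concrete, here is a minimal trainable model (a logistic-regression stand-in for a deep network, on invented toy data) where the highlighted values, not the model code, largely decide whether training succeeds:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for labeled vision data: two Gaussian "classes" in 2-D.
X = np.vstack([rng.normal(-1.0, 1.0, (200, 2)), rng.normal(1.0, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

# The hyper-parameters the talk refers to: small changes here can matter
# more than anything in the model code below.
learning_rate = 0.1
epochs = 50
batch_size = 32

w, b = np.zeros(2), 0.0
for _ in range(epochs):
    order = rng.permutation(len(X))
    for i in range(0, len(X), batch_size):
        idx = order[i:i + batch_size]
        p = 1.0 / (1.0 + np.exp(-(X[idx] @ w + b)))  # sigmoid output
        grad = p - y[idx]                            # cross-entropy gradient
        w -= learning_rate * X[idx].T @ grad / len(idx)
        b -= learning_rate * grad.mean()

accuracy = np.mean((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == y)
```

Try halving or tenfold-increasing the learning rate: the identical model code then converges slowly or diverges, which is the practical sense in which know-how dominates.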

11:30 am - 12:00 pm
Leveraging Open-Source Tools to Create Visual Programming Capabilities for Robots
Track: Collaborative Robots
 

Jorge Nicho

Research Engineer
Southwest Research Institute

Leveraging Open-Source Tools to Create Visual Programming Capabilities for Robots

This presentation describes a new simplified robot programming method based on visual demonstration of desired motion paths. In this method, an operator holds a pencil-shaped pointing tool equipped with visual markers that allow a vision system to track its location. The location information, along with a graphical interface, allows the operator to teach desired robot trajectories that can be executed to perform a variety of tasks. The goal of this project was to investigate the tools, feedback and workflow that can facilitate programming methods free of the difficulties of traditional robot teaching, which requires many hours of tedious waypoint programming and lacks flexibility.
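One piece of such a pipeline — turning a dense stream of tracked tool positions into a clean set of robot waypoints — can be sketched simply. This is a hypothetical illustration, not the project's actual code; the spacing tolerance and the straight-line "demonstration" are assumptions:

```python
import math

# Assumed tolerance: minimum distance (meters) between recorded waypoints.
MIN_SPACING = 0.025

def record_waypoints(tracked_poses, min_spacing=MIN_SPACING):
    """Downsample a stream of (x, y, z) tool positions into waypoints,
    keeping only points far enough from the last recorded one."""
    waypoints = []
    for pose in tracked_poses:
        if not waypoints or math.dist(pose, waypoints[-1]) >= min_spacing:
            waypoints.append(pose)
    return waypoints

# Simulated tracker output: the operator traces a short straight line,
# sampled every centimeter along x at a fixed height.
stream = [(0.01 * i, 0.0, 0.2) for i in range(20)]
path = record_waypoints(stream)
print(len(path), "waypoints from", len(stream), "tracked poses")
```

A real system would track full 6-DOF marker poses and hand the resulting trajectory to a motion planner, but the filtering idea is the same.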

11:30 am - 12:00 pm
Flexible Manufacturing and Collaborative Robotics: Not Your Grandparent's Robot
Track: Artificial Intelligence
Bernardo Mendez

Bernardo Mendez

Sr. Product Manager
Yaskawa Innovation

Flexible Manufacturing and Collaborative Robotics: Not Your Grandparent's Robot

Flexible manufacturing is requiring the robotics industry to redefine how robots are designed and the applications they are used in. In this session we will review what flexible manufacturing is, along with its benefits and challenges. We will also review the robotics industry in its present state and how it got there, which will help the attendee understand the challenges the industry faces in incorporating the new technologies being applied to manufacturing and logistics (AI, ML, vision, etc.), as well as how new startups and established robotics companies need to collaborate and change to meet the current and future requirements of manufacturing and logistics. Finally, the attendee will learn why collaborative robotics has a significant role to play in flexible manufacturing.

12:00 pm - 2:00 pm
Networking Lunch
2:00 pm - 2:30 pm
Real World Challenges of AI in Vision Applications
Track: Advanced Vision
Dany Longval

Dany Longval

Vice President Worldwide Sales
Teledyne Lumenera

Real World Challenges of AI in Vision Applications

Advancements in artificial intelligence (AI), particularly in deep learning, have further accelerated the proliferation of vision-based applications. Compared to traditional computer vision techniques, deep learning provides IoT developers with greater accuracy in tasks such as object classification. Those promises do not come without challenges. A year ago we embarked on a journey to incorporate AI in our machine vision cameras with a focus on serving traffic applications. In this presentation, we will discuss the challenges we faced along the way, including:

 
  • How to define vision requirements in the context of an AI environment
  • How hardware architectures must be selected based on the task at hand
  • How real-world characteristics impact result accuracy
  • How AI combined with traditional computer vision algorithms can prove useful
The focus of this presentation will be on the lessons learned in trying to harness the potential of AI from the point of view of a developer and integrator of vision solutions.

2:00 pm - 2:30 pm
Take an Agile Approach to Accelerate Industry 4.0 Adoption
Track: Collaborative Robots
Benjamin G. Gibbs

Benjamin G. Gibbs

CEO
READY Robotics

Take an Agile Approach to Accelerate Industry 4.0 Adoption

Agile automation applies the principles of Agile software development to industrial automation. An agile automation approach helps organizations deploy robotics quickly. Many automation projects fail due to never-ending scope creep: an attempt to automate one process expands to include upstream and downstream processes until it encompasses the entire factory. By applying an agile methodology, teams implement small, focused projects quickly, with built-in flexibility that accommodates rapidly changing requirements and maximizes ROI.

2:00 pm - 2:30 pm
Why the Field of Soft Robotics Fundamentally Changes AI
Track: Artificial Intelligence
Carl Vause

Carl Vause

CEO
Soft Robotics

Why the Field of Soft Robotics Fundamentally Changes AI

Session description coming soon.

2:30 pm - 3:00 pm
Sensor Networking for High-Definition Inspection
Track: Advanced Vision
Jonathan Hou

Jonathan Hou

Chief Technology Officer
Pleora Technologies

Sensor Networking for High-Definition Inspection

This presentation introduces high-definition machine vision inspection, which leverages different sensors and edge processing to speed inspection, improve quality, and increase automation. One of the promises of Industry 4.0 is the ability to improve inspection with advanced imaging capabilities. High-definition inspection uses novel software techniques to fuse traditional video images with 3D, hyperspectral, and IR data and AI capabilities to create an augmented view for robotics applications. Leveraging edge processing techniques now in development for Industry 4.0, combined with new transmission technologies, high-definition inspection can simplify networking complexities for advanced robotics systems.

2:30 pm - 3:00 pm
Collaborative Robot Integration at Scale - How Some Organizations are Moving the Needle while Others are Stuck in "Pilot Purgatory"
Track: Collaborative Robots
Dan Burseth

Daniel Burseth

Vice President of Business Development
Eckhart, Inc.

Collaborative Robot Integration at Scale - How Some Organizations are Moving the Needle while Others are Stuck in "Pilot Purgatory"

As a leading collaborative robot integrator, we're beginning to see major separation between companies that are embracing automation at scale vs. companies stuck in pilot purgatory and living with outdated manufacturing processes. We have insights and best practices to share on how some Fortune 500 OEMs are establishing a competitive advantage on the Industry 4.0 promise while others struggle. The presentation will include discussion of specific applications and approaches to scale from plant to plant as well as best practices for manufacturing technology implementation within large corporate bureaucracies.

3:00 pm - 3:30 pm
Break
3:30 pm - 4:00 pm
Boosting Manufacturing Quality with Smart Visual Inspection
Track: Advanced Vision
Adi Weinberger

Adi Weinberger

VP Sales and Marketing
Kitov.ai

Boosting Manufacturing Quality with Smart Visual Inspection

The concept of a fully automated, smart visual inspection process that can significantly drive quality improvement is becoming a reality. With new technology developments in areas such as artificial intelligence, 3D computer vision, robotics and big data analytics, manufacturers can apply powerful tools to get ahead of the game in achieving top quality, rapid introduction of new products and cost savings. In this presentation, a Kitov.ai executive will share a few interesting case studies where such technologies have been implemented and will provide insights into how these new technologies can be utilized to further enhance operational efficiencies. We will end the presentation with an outlook on the future of visual inspection.

3:30 pm - 4:00 pm
Integrated Sensor Solutions For Collaborative Robot Applications
Track: Collaborative Robots
Thomas Knauer

Thomas Knauer

Industry Region Manager - MPE, Americas
Balluff

Integrated Sensor Solutions For Collaborative Robot Applications

Sensors are a critical part of any robot application, including collaborative applications. A wide variety of sensors are used on and around robots, including inductive, photoelectric, capacitive, magnetic, RFID, safety and other types. Integrating these sensors and connecting them to the robot control system and network can present challenges due to multiple or long cables, slip rings, connection costs, etc. But device-level protocols, such as IO-Link and AS-Interface, provide simpler, cost-effective and "open" ways to connect these sensors to the control system.

3:30 pm - 4:00 pm
Rewarding the Robots: Reinforcement Learning for Intelligent Manufacturing
Track: Artificial Intelligence
Nick Ciubotariu

Nick Ciubotariu

SVP Software Engineering
Bright Machines

Rewarding the Robots: Reinforcement Learning for Intelligent Manufacturing

This presentation will explore using reinforcement learning (a training method for machine learning models that uses a positive reward to reinforce optimal behavior) in manufacturing, robotics and industrial automation. An overview of reinforcement learning will be provided, as well as requirements, advantages and disadvantages of use. The audience will be presented with several use cases that employ reinforcement learning in manufacturing and robotics, as well as some possible immediate practical applications for reinforcement learning.
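The core idea — a positive reward reinforcing optimal behavior — can be shown with tabular Q-learning on a toy task. This is a hypothetical sketch, not one of the presenter's use cases; the one-dimensional "line of cells" stands in for, say, a robot stepping a part toward a fixture:

```python
import random

random.seed(0)
N_STATES = 5          # cells 0..4; the positive reward lives at cell 4
ACTIONS = (1, -1)     # move right or left
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

# Q-table: expected discounted reward for each (state, action) pair.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for _ in range(200):                      # training episodes
    state = 0
    while state != N_STATES - 1:
        # epsilon-greedy: mostly exploit the table, occasionally explore.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if nxt == N_STATES - 1 else 0.0
        # Q-learning update: the reward propagates back along the path,
        # reinforcing the actions that led to it.
        best_next = max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                       - Q[(state, action)])
        state = nxt

# After training, the greedy policy moves right from every non-goal cell.
policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)
```

Industrial uses replace the table with a neural network and the toy line with a simulated cell or robot, but the reward-driven update is the same mechanism.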

4:00 pm - 5:00 pm
KEYNOTE: Present State and Future Directions for Intelligent Vision-Based Collaborative Robots
Henrik Christensen, Robust.AI and UC San Diego
Henrik Christensen

Henrik Christensen

Qualcomm Chancellor’s Chair in Robot Systems; IEEE & AAAS Fellow; Director, Contextual Robotics Institute; Professor of Computer Science and Engineering, Jacobs School of Engineering, UC San Diego
Robust.AI and UC San Diego

Present State and Future Directions for Intelligent Vision-Based Collaborative Robots

Collaborative robots have become mainstream in many different applications. Vision has also become a popular tool, not only in inspection but also in direct control for material handling. Only recently have we seen broad use of artificial intelligence in mainstream robotics. The next leap in performance and usability for robots may well come from the integration of robots, vision and AI. We will discuss current use cases for collaborative robots, vision and AI, and point to promising new applications for integrating these technologies on a 1- and 5-year horizon.