[Theory] REMINDER: 1/13 Thesis Defense: Andrea Daniele, TTIC

Mary Marre mmarre at ttic.edu
Thu Jan 12 12:50:40 CST 2023


*When*:    Friday, January 13th from *10:30 am - 12:30 pm CT*

*Where*:  Talk will be given *live, in-person* at
              TTIC, 6045 S. Kenwood Avenue
              5th Floor, *Room 530*

*Virtually*: attend virtually *here
<https://uchicagogroup.zoom.us/j/92823957638?pwd=aTAzYndmNjVWdS90L1ZacjlmZk1vZz09>*

*Who*:       Andrea Daniele, TTIC

------------------------------
*Title:      *Accessible Interfaces for the Development and Deployment of
Robotic Platforms

*Abstract: *Accessibility is one of the most important features in the
design of robots and their interfaces. Accessible interfaces allow
untrained users to easily and intuitively tap into the full potential of a
robotic platform. This thesis proposes methods that improve the
accessibility of robots for three different target audiences: consumers,
researchers, and learners.

In order for humans and robots to work together effectively, both must be
able to communicate with each other to convey information, establish a
shared understanding of their collaborative tasks, and coordinate their
efforts. Natural languages offer a flexible, bandwidth-efficient medium
that humans can readily use to interact with their robotic companions.

We work on the problem of enabling robots to understand natural language
utterances in the context of learning to interact with their environment.
In particular, we are interested in enabling robots to operate articulated
objects (e.g., fridge, drawer) by leveraging kinematic models. We propose a
multimodal learning framework that incorporates both vision and language
acquired in situ, where we model linguistic information using a
probabilistic graphical model that grounds natural language descriptions to
their referent kinematic motion. Our architecture then fuses the two
modalities to estimate the structure and parameters that define kinematic
models of articulated objects.
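The fusion of the two modalities can be illustrated with a toy Bayesian sketch; the names, categories, and numbers below are purely illustrative stand-ins, not the thesis's actual probabilistic graphical model:

```python
# Toy sketch: fuse a language-derived prior with a vision-based likelihood
# to estimate the kinematic model type of an articulated object.
# All names and numbers here are hypothetical, for illustration only.

def fuse_modalities(language_prior, vision_likelihood):
    """Posterior over kinematic model types via Bayes' rule."""
    unnormalized = {m: language_prior[m] * vision_likelihood[m]
                    for m in language_prior}
    z = sum(unnormalized.values())
    return {m: p / z for m, p in unnormalized.items()}

# Language: "the door swings open" suggests a revolute (rotational) joint.
language_prior = {"revolute": 0.7, "prismatic": 0.3}
# Vision: observed point trajectories fit a rotation slightly better.
vision_likelihood = {"revolute": 0.6, "prismatic": 0.4}

posterior = fuse_modalities(language_prior, vision_likelihood)
print(max(posterior, key=posterior.get))
```

The sketch captures only the final fusion step; in the thesis, both the linguistic grounding and the visual estimate are themselves structured probabilistic models learned from in-situ data.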

We then turn our focus to the development of accessible and reproducible
robotic platforms for scientific research. Most robotics research is
accessible to only a limited audience and usually takes place in idealized
laboratory settings or in unique, uncontrolled environments.

Owing to limited reproducibility, the value of a specific result is either
open to interpretation or conditioned on specific environments or setups.
In order to address these limitations, we propose a new concept for
reproducible robotics research that integrates development and
benchmarking, so that reproducibility is obtained “by design” from the
beginning of the research and development process. We first provide the
overall conceptual objectives to achieve this goal and then a concrete
instance that we have built: the DUCKIENet. We validate the system by
analyzing the repeatability of experiments conducted using the
infrastructure and show that there is low variance across different robot
hardware and labs.
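A repeatability analysis of this kind boils down to comparing the spread of a benchmark metric within and across sites. The sketch below uses made-up lab names and scores to show the shape of such a comparison (it is not DUCKIENet data):

```python
from statistics import mean, pvariance

# Hypothetical benchmark scores for the same experiment repeated
# on different robot hardware in different labs.
scores_by_lab = {
    "lab_A": [0.91, 0.93, 0.92],
    "lab_B": [0.90, 0.92, 0.91],
    "lab_C": [0.92, 0.94, 0.93],
}

# Within-lab variance measures repeatability on a single setup;
# variance of the lab means measures agreement across setups.
within = {lab: pvariance(runs) for lab, runs in scores_by_lab.items()}
across = pvariance([mean(runs) for runs in scores_by_lab.values()])
print(f"across-lab variance: {across:.6f}")
```

Low values of both quantities are what "reproducibility by design" aims for: the same experiment yields nearly the same result regardless of which lab or robot runs it.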

We then propose a framework, called SHARC (SHared Autonomy for Remote
Collaboration), to improve accessibility for underwater robotic
intervention operations. Conventional underwater robotic manipulation
requires a team of scientists on-board a support vessel to instruct the
pilots on the operations to perform. This limits the number of scientists
who can work together on a single expedition, effectively hindering a
robotic platform’s accessibility and driving up operating costs. On the
other hand, shared autonomy allows us to leverage human capabilities in
perception and semantic understanding of an unstructured environment, while
relying on well-tested robotic capabilities for precise low-level control.
SHARC allows multiple remote scientists to efficiently plan and execute
high-level sampling procedures using an underwater manipulator while
deferring low-level control to the robot. A distributed architecture allows
scientists to coordinate, collaborate, and control the robot while on
shore, thousands of kilometers away from one another and from the robot.
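The division of labor SHARC describes, where remote scientists issue high-level sampling goals and the robot expands them into low-level motions, might be sketched as follows (the class and method names are hypothetical, not SHARC's actual API):

```python
from dataclasses import dataclass

@dataclass
class SampleGoal:
    """High-level request from a remote scientist: what to sample, not how."""
    target_id: str
    tool: str  # e.g. "suction" or "grabber"

class Robot:
    """Stand-in for vehicle-side autonomy: expands a goal into motions."""
    def execute(self, goal: SampleGoal) -> list:
        # Low-level control (approach, grasp, stow) stays on the robot,
        # so scientists never teleoperate the manipulator joint by joint.
        return [f"approach:{goal.target_id}",
                f"{goal.tool}:{goal.target_id}",
                f"stow:{goal.target_id}"]

# A scientist on shore submits a goal; the robot plans and executes it.
robot = Robot()
log = robot.execute(SampleGoal(target_id="coral_03", tool="suction"))
print(log)
```

Because only compact goal descriptions cross the network, this split also tolerates the limited, high-latency links typical of remote oceanographic operations.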

Lastly, we turn our attention to the impact of accessible platforms in the
context of educational robotics. While online learning materials and
Massive Open Online Courses (MOOCs) are great tools for educating large
audiences, they tend to be completely virtual. Robotics has a hardware
component that cannot and must not be ignored or replaced by simulators.
This motivated our work in the development of the first hardware-based MOOC
in AI and robotics. This course, offered for free on the edX platform,
allows learners to study autonomy hands-on by programming real robots to
make their own decisions and accomplish broadly defined tasks. We design a
new robotic platform from the ground up to support this learning experience. A
completely browser-based experience, based on leading tools and
technologies for code development, testing, validation, and deployment
(e.g., ROS, Docker, VSCode), serves to maximize the accessibility of these
educational resources.

*Thesis Advisor*: *Matthew Walter* <mwalter at ttic.edu>




Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue, Rm 517*
*Chicago, IL  60637*
*773-834-1757*
*mmarre at ttic.edu <mmarre at ttic.edu>*

