Publications

Arthur Wandzel, Panpan Cai, and David Hsu, Perception for Planning: Integrating Attention into Planning for Deep Robot Navigation among Crowds, Work in Progress.

Arthur Wandzel, Seungchan Kim, Stefanie Tellex, and Yoonseon Oh, OO-POMCP: Robust Multi-Object Planning for Object-Oriented POMDPs, Preprint (2019)

High-level human cognition supports reasoning about multiple objects, as is evident in manipulation and navigation tasks. Object-based reasoning in unstructured, uncontrolled, real-world environments, however, poses a challenge: as the number of objects under consideration grows, planning in POMDPs becomes increasingly computationally intractable. In this paper, we theoretically and empirically demonstrate the efficiency of Object-Oriented Partially Observable Monte-Carlo Planning (OO-POMCP), an object-oriented online POMDP planner that scales robustly with the number of objects. The performance of OO-POMCP generalizes to a class of POMDPs whose models are factored in terms of objects. We first prove the sample efficiency of OO-POMCP via Rademacher complexity, deriving general upper bounds on the number of samples required for online POMDP planning. We then evaluate OO-POMCP on several domains relevant to multi-object reasoning and show that its performance is robust to the number of objects, even with small sample sizes, spatial dependencies between objects, and large observation spaces.

Keywords: object-based reasoning, Rademacher complexity, POMDPs, planning
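
For context, the Rademacher-complexity analysis mentioned in the abstract follows the standard uniform-deviation template. The bound below is the generic form of that template (written here as an illustration, not the paper's specific theorem for online POMDP planning): for a class $\mathcal{F}$ of functions $f : \mathcal{Z} \to [0,1]$ and $n$ i.i.d. samples $z_1, \dots, z_n$, with probability at least $1 - \delta$,

$$
\sup_{f \in \mathcal{F}} \left( \mathbb{E}[f(z)] - \frac{1}{n} \sum_{i=1}^{n} f(z_i) \right)
\;\le\; 2\,\mathfrak{R}_n(\mathcal{F}) + \sqrt{\frac{\ln(1/\delta)}{2n}},
\qquad
\mathfrak{R}_n(\mathcal{F}) = \mathbb{E}\!\left[ \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(z_i) \right],
$$

where the expectation defining $\mathfrak{R}_n(\mathcal{F})$ is over both the sample and independent Rademacher signs $\sigma_i \in \{\pm 1\}$. Bounds of this shape control how many samples are needed before empirical value estimates are uniformly close to their expectations.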

Arthur Wandzel, Yoonseon Oh, Michael Fishman, Nishanth Kumar, Lawson L.S. Wong, and Stefanie Tellex, Multi-Object Search using Object-Oriented POMDPs, IEEE International Conference on Robotics and Automation (ICRA 2019)

A core capability of robots is to reason about multiple objects under uncertainty. Partially Observable Markov Decision Processes (POMDPs) provide a means of reasoning under uncertainty for sequential decision making, but are computationally intractable in large domains. In this paper, we propose Object-Oriented POMDPs (OO-POMDPs), which represent the state and observation spaces in terms of classes and objects. The structure afforded by OO-POMDPs supports a factorization of the agent’s belief into independent object distributions, which enables the size of the belief to scale linearly rather than exponentially in the number of objects. We formulate a novel Multi-Object Search (MOS) task as an OO-POMDP for mobile robotics domains in which the agent must find the locations of multiple objects. Our solution exploits the structure of OO-POMDPs by using human language to selectively update the belief at task onset. Using this structure, we develop a new algorithm for efficiently solving OO-POMDPs: Object-Oriented Partially Observable Monte-Carlo Planning (OO-POMCP). We show that OO-POMCP with grounded language commands is sufficient for solving challenging MOS tasks both in simulation and on a physical mobile robot.

Keywords: multi-object search, grounded language, POMDPs, reinforcement learning, planning
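
As an illustration of the belief factorization described in the abstract above, the sketch below (hypothetical names, not the code accompanying the paper) keeps one distribution over candidate locations per object, so that an observation about one object updates only that object's distribution and storage grows linearly with the number of objects.

# Illustrative sketch: a belief over N objects stored as N independent
# per-object distributions, so memory grows linearly with the number of
# objects rather than exponentially with the size of the joint state space.
class FactoredBelief:
    def __init__(self, object_ids, locations):
        # start each object with a uniform distribution over candidate locations
        self.beliefs = {
            obj: {loc: 1.0 / len(locations) for loc in locations}
            for obj in object_ids
        }

    def update(self, obj, observation, likelihood):
        # Bayes update for one object's distribution only; the others are untouched.
        # `likelihood(observation, loc)` is an assumed observation model p(z | loc).
        b = self.beliefs[obj]
        for loc in b:
            b[loc] *= likelihood(observation, loc)
        total = sum(b.values())
        if total > 0:
            for loc in b:
                b[loc] = b[loc] / total

# usage: two objects over a three-cell corridor; a detection near cell 1
# sharpens only the mug's distribution
belief = FactoredBelief(object_ids=["mug", "book"], locations=[0, 1, 2])
belief.update("mug", "detected_near_1",
              lambda z, loc: 0.9 if loc == 1 else 0.05)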

Steven J. Jones, Arthur Wandzel, and John E. Laird, Efficient Computation of Spreading Activation Using Lazy Evaluation, International Conference on Cognitive Modeling (ICCM 2016)

Spreading activation is an important component of many computational models of declarative long-term memory retrieval, but it can be computationally expensive. This overhead has led to severe restrictions on its use, especially in real-time cognitive models. In this paper, we describe a series of successively more efficient algorithms for spreading activation. The final model uses lazy evaluation to avoid much of the computation normally associated with spreading activation. We evaluate its efficiency on a commonly used word sense disambiguation task, where it is significantly faster than a naive model, achieving an average time of 0.43 ms per query for a spread to 300 nodes.

Keywords: cognitive modeling, probabilistic methods, memory retrieval, SOAR
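
A minimal sketch of the lazy-evaluation idea described above (illustrative only, with made-up names, not the implementation evaluated in the paper): activating a source is constant-time bookkeeping, and the spread into a node is computed only when that node's activation is actually queried.

class LazySpreadingActivation:
    # Toy model: activation spreads one hop from active source nodes,
    # discounted by edge weight. Nothing is propagated at activation time;
    # the spread is computed on demand, when a node is queried.
    def __init__(self, incoming_edges):
        # incoming_edges: dict mapping target -> list of (source, weight)
        self.incoming_edges = incoming_edges
        self.base = {}        # node -> base-level activation
        self.sources = set()  # nodes currently spreading activation

    def activate(self, node, value):
        # O(1) bookkeeping; no graph traversal happens here
        self.base[node] = value
        self.sources.add(node)

    def activation(self, node):
        # the spread into `node` is computed lazily, only for queried nodes
        spread = sum(self.base.get(src, 0.0) * w
                     for src, w in self.incoming_edges.get(node, [])
                     if src in self.sources)
        return self.base.get(node, 0.0) + spread

# usage, in the spirit of the word sense disambiguation task: with "river"
# active, the river sense of "bank" receives spread, while the money sense
# contributes nothing until "money" is activated
edges = {"bank(river)": [("river", 0.7)], "bank(money)": [("money", 0.9)]}
memory = LazySpreadingActivation(edges)
memory.activate("river", 1.0)
print(memory.activation("bank(river)"))   # 0.7
print(memory.activation("bank(money)"))   # 0.0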