In this paper, we present a control architecture for an intelligent outdoor mobile robot. It enables the robot to navigate a complex, natural outdoor environment, relying on only a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure-from-motion algorithm computes a depth map of the environment, and a visual simultaneous localization and mapping (SLAM) algorithm builds a map of the surroundings using image features.
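To make the twofold analysis concrete, the following is a minimal sketch of how the two branches could be computed from consecutive camera frames. It assumes OpenCV, grayscale input, and a (simplifying) forward-translating camera; the paper's actual dense structure-from-motion and SLAM algorithms are not specified here, so dense optical flow and ORB features stand in as illustrative placeholders.

```python
import cv2
import numpy as np

def analyze_frame_pair(prev_gray, curr_gray):
    """Return (depth_proxy, keypoints, descriptors) for one image pair."""
    # Dense branch: Farneback optical flow between consecutive frames.
    # Arguments: pyr_scale=0.5, levels=3, winsize=15, iterations=3,
    # poly_n=5, poly_sigma=1.2, flags=0.
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    # Under the assumed pure forward translation, apparent image motion
    # is inversely related to depth, so inverse flow magnitude serves as
    # a coarse per-pixel depth proxy for obstacle detection.
    magnitude = np.linalg.norm(flow, axis=2)
    depth_proxy = 1.0 / (magnitude + 1e-6)

    # Sparse branch: image features that a visual SLAM back end could
    # use to build and localize against a map of the surroundings.
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, descriptors = orb.detectAndCompute(curr_gray, None)
    return depth_proxy, keypoints, descriptors
```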
This information enables a behavior-based motion and path planner to navigate the robot through the environment. In this paper, we present the theoretical aspects of setting up this architecture.
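As a rough illustration of the behavior-based planning layer, the sketch below blends two simple behaviors, obstacle avoidance driven by the dense depth map and goal seeking driven by the SLAM pose estimate, into a single steering command. The behavior set, the weights, and the arbitration-by-weighted-sum scheme are assumptions for illustration, not the paper's exact design.

```python
import numpy as np

def avoid_obstacles(depth_map, threshold=1.5):
    """Steer away from the image half containing the nearest obstacle."""
    h, w = depth_map.shape
    left, right = depth_map[:, :w // 2], depth_map[:, w // 2:]
    if min(left.min(), right.min()) > threshold:
        return 0.0  # nothing nearby: no steering correction
    return 1.0 if left.min() < right.min() else -1.0  # turn away

def seek_goal(robot_pose, goal_xy):
    """Steer toward the goal; robot_pose = (x, y, heading in radians)."""
    x, y, heading = robot_pose
    bearing = np.arctan2(goal_xy[1] - y, goal_xy[0] - x)
    # Wrap the heading error into [-pi, pi] before clipping.
    error = np.arctan2(np.sin(bearing - heading), np.cos(bearing - heading))
    return float(np.clip(error, -1.0, 1.0))

def plan_step(depth_map, robot_pose, goal_xy, w_avoid=0.7, w_goal=0.3):
    """Combine behavior outputs by weighted sum into one steering command."""
    return (w_avoid * avoid_obstacles(depth_map)
            + w_goal * seek_goal(robot_pose, goal_xy))
```

A priority-based arbiter (obstacle avoidance fully overriding goal seeking when an obstacle is close) would be an equally plausible choice for such an architecture; the weighted sum is used here only because it is the simplest scheme to state.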