Applications deployed at the edge are often subject to critical Quality of Service (QoS) objectives, such as meeting deadlines while optimizing energy consumption. Designing and operating middleware that satisfies these objectives requires understanding the runtime and power-consumption characteristics of the edge platform. However, although edge platforms are frequently deployed in environments where ambient conditions cannot be controlled, most characterizations ignore these environmental factors. We characterize the impact of ambient temperature on the power consumption and runtime of machine learning inference applications running on a popular edge platform, the NVIDIA Jetson TX2. Our rigorous data collection and statistical methodology reveal a sizeable ambient-temperature impact on power consumption (about 20% on average, and up to 40% for some workloads) and a moderate impact on runtime (up to 5%).