While Americans are fretting over whether President Trump will oversee a nuclear war with North Korea, the greatest threat to our existence, to our humanity, is coming from inside the house.

On Monday, the robotics company Boston Dynamics published a video of its nimble four-legged robot, SpotMini, approaching a closed door and summoning its "pal," a second SpotMini equipped with an arm, to turn the handle and open the door for both of them to go through.

While we should encourage scientific and technological advancement and exploration (e.g., last week's SpaceX Falcon Heavy test flight), the pursuit of advancing society through this type of technology sometimes reaches a boundary that raises red flags. Boston Dynamics repeatedly crosses it.

It's reminiscent of the primates cloned in China in late January. Sure, there might be worthwhile discoveries that advance medical research for humans, but do the ends really justify the means? It raises ethical questions about what scientists should and shouldn't do. To quote Jeff Goldblum's Dr. Ian Malcolm in 1993's "Jurassic Park": "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."

And to those who think there's nothing to worry about because our government would never allow something like killer robots that use biomatter (i.e., humans) as fuel, as in the video game Horizon Zero Dawn: that may not be for us to decide. Our government may impose regulations or restrictions barring the creation or proliferation of technology capable of destroying us, but all it takes is one place, a safe haven, willing to let it happen.

Boston Dynamics may deserve the benefit of the doubt that it is building machines to help humans, but we should keep a watchful eye on its operations and maintain enough oversight to ensure its technology isn't used, or even replicated, for more nefarious ends.

Siraj Hashmi is a commentary video editor and writer for the Washington Examiner.