The Solution for Planetary Exploration

Jerry Qi
7 min read · Apr 30, 2021
The Vast Space

Space, and the planets within it, is a place we all dream of going, and right now, Mars is at the top of the list. Technically, we have already reached Mars through the cameras, rovers, and all the other pieces of technology we keep sending there.

Interplanetary exploration has been a goal of our society for a long time.

There are many reasons driving planetary exploration: the search for life, understanding a planet and its evolution, preparing for future exploration and colonization, and opportunities in areas such as space technologies (e.g. mining), health, and potential sources of sustainable energy from space.

So, why isn’t there a human on Mars yet?

Well, there are two main roadblocks: technological difficulty and hazards. The hazards include radiation exposure during the trip and on the planet’s surface, toxic soil, low gravity (38% of Earth’s), and cold temperatures (an average of -81 °F).
The radiation on Mars is far greater than that on Earth.

The cosmic ray environment on Mars

The Mars Odyssey probe detected ongoing radiation levels on Mars that are 2.5 times higher than those on the International Space Station, at about 8,000 millirads per year. It also detected solar photon radiation events on Mars peaking at about 2,000 millirads per day.

People in developed nations are exposed to approximately 620 millirads per year. Compared to Earth, then, Mars represents a radiation increase of around 1,190%.
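The comparison above can be reproduced with a quick back-of-envelope calculation, using only the figures quoted in this article:

```python
# Figures taken from the text: ~8,000 millirads/year on Mars
# (Mars Odyssey data) vs ~620 millirads/year for people in
# developed nations on Earth.
mars_mrad_per_year = 8000
earth_mrad_per_year = 620

increase_pct = (mars_mrad_per_year - earth_mrad_per_year) / earth_mrad_per_year * 100
print(f"Mars dose is ~{increase_pct:.2f}% higher than Earth's")  # ~1190.32%
```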

Prolonged exposure to the levels of radiation detected on Mars could lead to all kinds of health problems — like acute radiation sickness, increased risk of cancer, genetic damage, and even death.

Aside from this, the other major problem is the extremely toxic soil on Mars, caused by high concentrations of perchlorate compounds (chlorine-based salts such as calcium perchlorate).

The levels of calcium perchlorate detected in the Martian soil are around 0.5%, a concentration toxic to both humans and plants.

Martian soil and fine dust can also be whipped up in sandstorms, making a human trek on Mars even less feasible.

These difficulties and hazards raise the need for a plausible solution for planetary exploration. That’s us: BlueSpace.

BlueSpace

Logo: BlueSpace

We have envisioned and designed a solution combining VR, AI, haptics, and tele-robotics to give astronauts a risk-free way to explore hazardous planets in the future.

Put simply, the solution is a remote-controlled robotics system that lets astronauts explore planets using advanced robotics and a VR-and-haptics immersion system in the form of a wearable, portable full-body suit.

Integrated Technologies

There are two parts to the overall product. First, the control side: a full-body suit for the astronaut. And second, the robot, controlled by the astronaut from a home base within a dedicated range.

VR

Virtual reality is a large part of the control and immersion experience for the astronaut. The full-body suit will have VR technology integrated into the helmet to simulate accurate sight, as if the operator were on site.

For the hardware, an HTC Vive headset will be integrated into the suit (NASA already uses HTC headsets for astronaut training). Beyond sight itself, an omni-directional treadmill like the Virtuix Omni will fully immerse the astronaut in the environment. It enables motion, rotation, and height sensing across all possible terrain, on top of precise tracking at a 1,000 Hz refresh rate.

Virtuix Omni

On the robot that is deployed, the VR integration will be connected to a modified version of the Mastcam-Z camera which is used on the Mars Perseverance rover.

There will be two cameras on the robot, 9.5 inches apart, producing stereoscopic 3D images of the live environment. Beyond helping the astronaut see the surroundings, the cameras allow additional information-gathering functions (e.g. zoom, focus, ultraviolet, infrared). They can also assist operations by documenting situational occurrences such as weather and natural disasters.
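The principle behind the paired cameras is standard stereo triangulation: a point seen by both cameras shifts between the two views, and that shift (disparity) plus the known 9.5-inch baseline gives its distance. A minimal sketch, where the focal length and disparity values are illustrative, not Mastcam-Z specifications:

```python
# Depth from stereo disparity: Z = f * B / d.
BASELINE_M = 9.5 * 0.0254  # 9.5 inches between cameras, in metres

def depth_from_disparity(focal_px: float, disparity_px: float) -> float:
    """Distance to a point seen by both cameras, given its pixel shift."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * BASELINE_M / disparity_px

# A feature shifted 40 px between the two views, with a 1000 px focal length:
print(depth_from_disparity(1000, 40))  # roughly 6 m away
```

Closer objects produce larger disparities, which is why a wider baseline improves depth resolution at range.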

Haptics

The haptics will likewise consist of two segments: the feedback on the suit itself, and the robot’s haptic receptors.

To best simulate the environment for the astronaut, haptic receptors will be implemented on the robot; their readings will then be sent back to the haptic textile on the suit.
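The robot-to-suit loop can be sketched in a few lines. Both functions below are hypothetical placeholders for the real telemetry and actuation layers, assumed here only to show the data flow:

```python
# Sketch of the robot -> suit haptic loop.
def read_robot_receptors() -> dict:
    # In the real system this would poll the robot's haptic sensors;
    # here we return fixed illustrative contact forces in newtons.
    return {"thumb": 2.1, "index": 0.8}

def drive_suit_actuators(forces: dict) -> dict:
    # Clamp each reading to the suit's safe actuation range before playback.
    MAX_FORCE_N = 20.0
    return {digit: min(f, MAX_FORCE_N) for digit, f in forces.items()}

commands = drive_suit_actuators(read_robot_receptors())
```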

Haptic gloves

HaptX DK2 Gloves

The gloves integrated into the suit will use technology similar to that of the HaptX DK2 Gloves, utilizing advanced components such as microfluidic skin and pneumatic actuators to simulate precise tactile feedback.

Specifications include:

  • 200 points of feedback on each glove
  • Can produce 20 newtons of force on each digit with mN precision
  • Sub-millimetre motion-capture precision with a 200 Hz refresh rate (this could be higher or lower depending on how far away the mech is operated from); expect latency of up to 15 milliseconds
  • Connected to the pneumatic controller via a tether, as long as it needs to be, though it may somewhat limit the operator’s movement
  • The pneumatic controller must be attached to an air compressor, which takes up about 11,000 cm³ of space and weighs about 25 kg
  • Power consumption: 25 V, a maximum of 1,500 W
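The refresh rate and latency figures above imply a useful rule of thumb: at 200 Hz the suit samples the operator's hands every 5 ms, so a 15 ms latency means the robot acts on a pose roughly three frames old. A quick check of that arithmetic:

```python
# Timing budget from the glove specs: 200 Hz capture, up to 15 ms latency.
REFRESH_HZ = 200
LATENCY_MS = 15

frame_period_ms = 1000 / REFRESH_HZ   # time between pose samples
frames_behind = LATENCY_MS / frame_period_ms  # how stale the acted-on pose is
print(frame_period_ms, frames_behind)
```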

The Suit itself

The Suit model

The suit will be covered in haptics, including 100 electro-stimulation channels spread throughout for tactile feedback, with a refresh rate of up to 300 Hz per channel. There will be 70 electrodes on the upper body of the suit and 60 on the lower body (the pants), placed at specific anatomical locations to take advantage of neuromuscular electrical stimulation.

There will also be 15 internal motion-capture sensors, each with a 200 Hz refresh rate.

Vision for the robot

The model

A robot will venture into the environment instead of the operator, who controls it through the suit. Although the suit is shaped for the human operator, the robot will not be humanoid, since that form would move less efficiently than others. Instead, the robot will pair a non-humanoid body with human-like appendages, such as robust hands and arms, so the operator can interact with the environment using normal human movements.

Along with arms and hands, many tools will be available to the astronaut, used as though attached to their hands: drills, shovels, and so on.

Aside from human-like interaction, the robot will make up for human limitations. One of them is movement: it will traverse terrain as a rover instead of walking, providing speed along with better adaptability to the terrain and topography of landmasses. The operator’s controls will still be performed in a human manner, but will be translated into commands the robot can execute.
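One way that human-to-rover translation could work is a differential-drive mapping: the operator's walking speed and body rotation (e.g. from the treadmill) become left and right wheel velocities. The function and track width below are illustrative assumptions, not part of the actual design:

```python
# Sketch: mapping human-style motion onto rover wheel speeds
# using a standard differential-drive model.
TRACK_WIDTH_M = 1.0  # assumed distance between left and right wheels

def human_to_wheel_speeds(forward_mps: float, turn_rate_radps: float):
    """Map walking speed + body rotation to (left, right) wheel velocities."""
    left = forward_mps - turn_rate_radps * TRACK_WIDTH_M / 2
    right = forward_mps + turn_rate_radps * TRACK_WIDTH_M / 2
    return left, right

# Walking straight at 1 m/s while turning slightly right:
print(human_to_wheel_speeds(1.0, -0.2))  # (1.1, 0.9)
```

The outer wheel spins faster than the inner one, which is exactly how a wheeled rover turns without steering linkages.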

The camera will be equipped with an AI computer-vision algorithm that can pick out areas of interest depending on the astronaut’s mission.

Lastly, to ease the strain on the solar-powered supply, the wheels will be equipped with piezoelectric elements so the robot can harvest power as it moves.

Arms on the robot

The robot’s hand system will provide 24 points of movement, closely mimicking the natural dexterity of a human hand:

  • 20 actuated degrees of freedom
  • 4 under actuated degrees of freedom

In order for the robotic hand to be as precise as possible, it will be built as a self-contained system, meaning all actuation and sensing are built into the fingers, hands, and forearms.

To communicate with the suit, the hands and all of the haptic receptors on the mech will use an EtherCAT bus (Ethernet for Control Automation Technology), providing a 100 Mbps Ethernet-based communications fieldbus and full integration into the robot operating system. The hands will also have precise motion and position sensing at each degree of freedom, as well as strain gauges at every synthetic tendon and pressure and tactile receptors in the fingertips.
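A 100 Mbps fieldbus is comfortably more than this telemetry needs. A rough budget, where the channel counts, sample sizes, and rate are illustrative assumptions rather than measured figures:

```python
# Back-of-envelope bandwidth budget for the hand telemetry on the bus.
CHANNELS = 24 + 24 + 5    # DoF encoders + tendon strain gauges + fingertip pads
BYTES_PER_SAMPLE = 4      # assume 32-bit sensor values
RATE_HZ = 1100            # fastest refresh rate quoted for the fingertips

bits_per_second = CHANNELS * BYTES_PER_SAMPLE * 8 * RATE_HZ
utilisation_pct = bits_per_second / 100e6 * 100
print(f"{bits_per_second / 1e6:.2f} Mbps, {utilisation_pct:.1f}% of the 100 Mbps bus")
```

Even with protocol overhead, sensor traffic would occupy only a few percent of the link, leaving headroom for video and control channels.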

However, the hands themselves aren’t enough to create HD tactile feedback. We would install synthetic fingertips that mimic human fingertips, capable of sensing force, vibration, and temperature: essentially everything the mechanoreceptors in a human fingertip can.

Specs on the fingertips:

Modality: Force

  • Up to 70 N of force, ±10 mN
  • 100 Hz refresh rate

Modality: Fluid pressure

  • 100 kPa, ±20 Pa
  • 1,100 Hz refresh rate

Modality: Micro-vibrations

  • 800 Pa, ±0.5 Pa
  • 1,100 Hz refresh rate

Modality: Temperature

  • Up to 80 °C, ±0.1 °C
  • 25 Hz refresh rate (lower, since temperature is less critical than tactile feedback)
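One simple way to bundle the four modalities above into a single telemetry record, with field comments mirroring the listed specs (the class itself is a hypothetical sketch, not part of the design):

```python
from dataclasses import dataclass

@dataclass
class FingertipSample:
    """One reading from a synthetic fingertip."""
    force_n: float            # up to 70 N, ±10 mN
    fluid_pressure_pa: float  # up to 100 kPa, ±20 Pa
    vibration_pa: float       # up to 800 Pa, ±0.5 Pa
    temperature_c: float      # up to 80 °C, ±0.1 °C

# Example: a light touch on a cold Martian rock.
sample = FingertipSample(force_n=1.5, fluid_pressure_pa=98_000,
                         vibration_pa=12.0, temperature_c=-20.0)
```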

Our Vision

Beyond astronaut usage, the BlueSpace robot has endless applications with the potential to disrupt multiple industries on Earth, from dangerous firefighting missions to tourism and exploring the world’s greatest attractions. To realize this vision, the robot’s technology can be altered to suit each use case.

Furthermore, by combining brain-computer interfaces with our tech, people with physical disabilities and paraplegia could control the appendages and partake in activities and experiences previously out of reach.


Jerry Qi

Grade 11 student at tks.world. Passionate about emerging tech and writing about it.