
In 17 days and 600 lines of code, I photographed the International Space Station from 400 kilometers away

Bo Wen, Xiao Xiao | From Aofeisi

Qubits | Official account QbitAI

How do you take such a photo of a space station 400 kilometers away?

As the largest man-made object in space, the International Space Station has always been a dream target for countless astronomy enthusiasts.

The method most people use is to calculate the station's transit time (anywhere from a few seconds to a few minutes) from their own latitude and longitude, then shoot with a phone or a mirrorless camera, and snap: what you get is a tiny dot of light like this:


A slightly more advanced approach is the "transit" shooting method: before the space station flies across the sun or the moon, use one of these two bodies as a backdrop to pin down a much smaller target area.

Then wait on the transit centerline, keeping a large-aperture, long-focal-length telescope weighing anywhere from several kilograms to tens of kilograms stably on target for a long stretch, and press the shutter in the under-one-second window when the station crosses the moon or the sun.


Shots taken this way are sharper, but the fatal problem remains:

The space station orbits the Earth once every 90 minutes on average. Next to it, Stellarium, the well-known night-sky simulation tool, looks like a "still life" (the comparison runs in real time, with no speed-up):

In short, the space station is running too fast.

As a result, not only is the number of frames you can shoot in that one second limited (too few to average out atmospheric shake by stacking), and not only can you capture a silhouette at best, you may even miss the shot entirely because of a calculation error.

To take sharper photos, you need not only extraordinary patience but also deft, well-practiced manual control.

For example, in a strong wind the result easily blurs into a smear:

△ Image: Wang Zhuoxiao, used with permission

At this point, someone had an epiphany:

If you don't have a superhumanly steady arm, why not use code to make the telescope move on its own?

With that settled, this self-described "amateur programmer" ground away for 17 days straight and built an automatic tracking system.

With this automatic tracking system, the telescope is no longer limited to a few still frames in a specific few-second window; it can keep following the space station continuously for 2 minutes.

Finally, multiple frames were stacked and post-processed into a sharp, high-resolution animated GIF:

(That's the picture we started with)

It is this image that led netizens to exclaim that it "single-handedly opens the era of amateur satellite-tracking photography".

So we reached out to the developer himself, Liu Boyang, a Peking University astronomy alumnus with a PhD in astrophysics, for a chat.

What makes photographing the space station at high precision so hard?

First of all, it is necessary to briefly understand the "timing" of shooting the space station.

Although the space station moves extremely fast, circling the Earth once every 90 minutes on average at an altitude of roughly 400 kilometers, and is visible to the naked eye, we cannot observe it at just any time.

There are two main constraints: field of view and observation time.

"Field of view" means the space station must fly into the range we can actually see, i.e., the moment it happens to "transit" overhead;

"Observation time" means the window in which the station is observable. The station emits no light of its own; only within about two hours after sunset or two hours before sunrise does it reflect sunlight brightly enough, making those windows the best for shooting.

Only when both conditions are met at the same time do we get a chance to observe and photograph the station from the ground, and even then the result is at the mercy of factors like weather (as in the picture below, we ran into clouds):

△ Image: Zhu Yijing & Xu Chengcheng, used with permission
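The two constraints above can be sketched as a single boolean check. The sketch below is illustrative only, not Liu Boyang's code: the 10° elevation floor and the band of sun altitudes are assumed thresholds standing in for "actually visible" and "within roughly two hours of sunset or sunrise".

```python
def observable(sat_elevation_deg, sun_altitude_deg):
    """Rough visibility check for a satellite pass.

    A pass is worth shooting only when the satellite is well above the
    horizon AND the observer is in twilight darkness while the station,
    400 km up, is still catching sunlight. Thresholds are illustrative.
    """
    above_horizon = sat_elevation_deg > 10          # clear of ground clutter
    in_twilight = -18.0 < sun_altitude_deg < 0.0    # sun just below horizon
    return above_horizon and in_twilight

# Midday: the sky is too bright, no reflected light stands out
assert observable(45, 30) is False
# Shortly after sunset: station high, sun 10 degrees below the horizon
assert observable(45, -10) is True
# Deep night: the station is inside Earth's shadow
assert observable(45, -30) is False
```

In practice both angles would come from orbital data (e.g. via an ephemeris library); here they are simply passed in.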

However, the common celestial-photography methods that already exist are not really suitable for higher-precision photos of the space station.

The first method is to shoot with a "hand-cranked" telescope, that is, to push the telescope by hand to track the target.

The flaw here is that you cannot get a truly sharp image of the station. Because tracking is manual, you cannot go straight to a long telephoto lens; that would be like chasing a fast-moving ant with a microscope, and the station slips out of the frame the moment your attention lapses.

The second method is a stakeout: set up a battery of high-resolution lenses and gear in one spot and wait for the space station to "pass by" on its own.

This method requires no lens movement; instead you wait for the station to come to you. But it brings new problems: the pass can be very short, sometimes only a few seconds, so you may capture nothing at all; and even when you do, the inability to adjust the angle means the result is not guaranteed.

So, why not shoot with the tracking function that comes with the telescope?

That function is usually only good for tracking the apparent rise and set, caused by Earth's rotation, of bodies such as the sun, moon, planets, and stars; after all, those move slowly and essentially in sync with the Earth's rotation. For a fast mover like the space station, the telescope cannot keep up on its own.

So in the end, high-precision tracking of the space station has to rely on software assistance.

The third method is tracking by orbital elements: use the ephemeris data published on astronomy websites (such as Heavens-Above) to steer the telescope along a precomputed path, with manual corrections:

At present, most astronomy enthusiasts use this tracking-plus-fine-tuning approach, and some fairly mature programs already exist online, for example driving a motorized alt-azimuth mount along the station's orbital elements:

But you never know whether those websites are up to date. Sometimes the station adjusts its orbit, the website lags behind, and your program simply stops working.

Using optical recognition, the error is controlled within 4 pixels

As an old astronomy hand, Liu Boyang knew all the problems above only too well.

His initial idea was to find the "light spot" in the frame with existing software and identify and track the target by optical recognition.

But when he went looking for suitable programs, he found they were either unmaintained (some so old they no longer even ran on current Windows), or updated too slowly and overly complex, or simply closed-source and paid.

So Liu Boyang finally decided to do it himself: write an optical-recognition auto-tracking script in which the station is first acquired manually and then handed off to PID-based tracking control.

His plan was divided into two steps:

The first step, writing a program that automatically recognizes and tracks the space station, took 5 days.

It is worth mentioning that optical recognition was not Liu Boyang's first choice.

He did consider element-based tracking plus manual fine-tuning, including stepless joystick control of the equatorial mount's speed, and coarse tracking from orbital elements combined with stepless gamepad fine-tuning, but the test shots were unsatisfactory (his hand was not steady enough when fine-tuning).

So, based on the PID control principle, he wrote an optical tracking routine. PID is a very classic control algorithm; the letters stand for the proportional, integral, and derivative terms. Keeping a two-wheeled self-balancing robot upright, for example, relies on this algorithm.

Liu Boyang had never studied this material before, but to build a stable automatic control system he naturally introduced a proportional term (P) and an integral term (I) to drive down the system's error.
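As a sketch of those two terms (Liu Boyang's code is not published; the gains, timestep, and pixel-error interpretation below are invented for illustration), a minimal PI controller looks like this:

```python
class PIController:
    """Minimal proportional-integral controller, a sketch of the idea.

    `error` is the target's pixel offset from the frame center; the
    return value is a speed correction for one mount axis. Gains (kp,
    ki) and the timestep dt are made-up example values.
    """
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.dt                  # I: accumulated bias
        return self.kp * error + self.ki * self.integral  # P + I

pi = PIController(kp=0.5, ki=0.1, dt=0.1)
# A constant 8-pixel offset: the P part stays fixed while the I part ramps
outputs = [pi.update(8.0) for _ in range(3)]
```

With a steady error, the integral term keeps growing the correction, which is exactly what eliminates a persistent tracking lag that a proportional term alone would leave behind.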

Liu Boyang's telescope has two parts: a wide-field finder scope and a narrow-field main scope. The algorithm's basic goal is to compute, from the station's current position in the finder, how far it deviates from the main scope's field, then adjust the telescope's tracking speed to correct that deviation and bring the station into the main scope's field of view.

With this program, the finder can quickly follow a moving "light source" and keep it centered in the frame. To test it, Liu Boyang used a laser pointer to sweep a bright dot across his wall at constant speed, simulating the station's motion.
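The optical-recognition step itself boils down to locating the bright spot's centroid in each frame. This toy version (the threshold and frame format are assumptions, not the author's implementation) shows the idea; the offset of the returned position from the frame center is the error a tracking loop then tries to drive to zero:

```python
def bright_spot_centroid(frame):
    """Find the bright spot in a grayscale frame (a list of pixel rows).

    Threshold the image, then take the intensity-weighted mean position
    of the bright pixels. Returns (x, y) in pixel coordinates, or None
    if nothing exceeds the threshold.
    """
    threshold = 128  # assumes 8-bit pixels; value is illustrative
    sx = sy = total = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v > threshold:
                sx += x * v
                sy += y * v
                total += v
    if total == 0:
        return None
    return (sx / total, sy / total)

# A 5x5 frame with a single bright pixel at column 3, row 1
frame = [[0] * 5 for _ in range(5)]
frame[1][3] = 255
assert bright_spot_centroid(frame) == (3.0, 1.0)
```

In the real setup the frames would come from the finder scope's video feed; here the frame is built by hand.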

The program itself was developed on top of a platform called ASCOM.

ASCOM can bring every piece of astronomical gear, such as the telescope's focuser, the filter wheel's rotation, and the camera's shutter, together under one piece of software, and it is a very widely used software interface standard in astronomy:
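On Windows, an ASCOM telescope driver is typically reached as a COM object (e.g. via `win32com.client.Dispatch`) and slewed with the standard `MoveAxis(axis, rate)` method of the ASCOM Telescope interface. Everything else below, the stand-in driver class and the `steer` helper, is hypothetical scaffolding so the sketch runs anywhere without the ASCOM platform installed:

```python
AXIS_PRIMARY, AXIS_SECONDARY = 0, 1  # ASCOM TelescopeAxes constant values

class FakeTelescope:
    """Duck-typed stand-in for an ASCOM telescope driver (hypothetical)."""
    def __init__(self):
        self.rates = {AXIS_PRIMARY: 0.0, AXIS_SECONDARY: 0.0}

    def MoveAxis(self, axis, rate):
        # Real drivers start the axis moving at `rate` degrees per second
        self.rates[axis] = rate

def steer(scope, x_correction, y_correction):
    """Push per-axis speed corrections (deg/s) to the mount."""
    scope.MoveAxis(AXIS_PRIMARY, x_correction)
    scope.MoveAxis(AXIS_SECONDARY, y_correction)

scope = FakeTelescope()
steer(scope, 0.25, -0.1)
assert scope.rates == {AXIS_PRIMARY: 0.25, AXIS_SECONDARY: -0.1}
```

In a tracking loop, the corrections would come from a controller fed by the target's pixel offset, updated each video frame.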

Hardware preparation, in addition to laptops, includes:

Celestron EdgeHD 11-inch aperture, f/10 catadioptric telescope, on a CGEM equatorial mount

Canon EOS R5 camera

QHY5III462C camera, used as the guide camera

Thrustmaster T.16000M game controller

Of these, the telescope cost about 40,000 yuan; the Canon EOS R5 was rented for two weeks for 2,200 yuan (it retails around 25,000 yuan); the 462C camera was under 1,000 yuan; and the controller was swapped from a friend (retail just over 500 yuan).

The whole rig came to under 45,000 yuan. According to Liu Boyang, if you don't need this level of precision, a complete setup can be had for under 10,000 yuan.

Next came the second step: shooting in the field, and successfully using this equipment to take a high-precision photo of the space station.

But the real shoot turned out harder than imagined; throughout it, Liu Boyang "was constantly trial-and-erroring and fixing bugs".

His initial goal was the Chinese space station, but two bugs in a row cost him the best windows of two passes.

On March 23, the automatic optical tracking never engaged because focusing was not done in time; on March 27, the finder's field of view was only about 3°, and that small field caused the initial acquisition to fail, again never reaching the automatic tracking stage.

By then, the Chinese space station's next visible transit was still a long way off. So after fixing the operational problems (widening the finder's field of view to 15°), Liu Boyang decided to "practice" first on the International Space Station, which was about to offer several excellent passes.

After changing the "acquisition" step of the auto-tracking program to a manual trigger, Liu Boyang successfully locked onto the International Space Station on April 2.

There were still imperfections, such as a software crash wiping out the calibration data aligning the finder with the main scope; Liu Boyang responded by adding a function to save that calibration data.

By this time, the code had gone from the original 400 lines to 600 lines.

Finally, on the evening of April 3, after urgently fixing the bug, Liu Boyang successfully captured the International Space Station.

Specifically, the telescope acquires the station on two axes, x and y: after the acquisition was triggered, the y axis locked on quickly, while the x axis lagged by about 10 seconds.

From about the 30-second mark, both axes stayed within a stable error band (about four pixels), and this high-precision tracking lasted 120 seconds in all, recording the ISS's entire pass from approach to departure:

The station initially spanned only about 100 pixels in the raw frames; after multi-frame oversampling, this was raised to more than 200 pixels.

Finally, after processing, a sequence of 300×300-pixel frames (composited into a GIF) was produced:
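The stacking idea can be shown in miniature. Real super-resolution stacking aligns frames at sub-pixel precision before combining; this toy (pure Python, illustrative only, not the author's pipeline) just upsamples each frame 2x and averages, enough to see how many noisy frames become one smoother, larger image:

```python
def upsample2x(frame):
    """Nearest-neighbor 2x upsample of a grayscale frame (list of rows)."""
    out = []
    for row in frame:
        wide = [v for v in row for _ in (0, 1)]  # duplicate each column
        out.append(wide)
        out.append(list(wide))                   # duplicate each row
    return out

def stack(frames):
    """Average a list of equally sized frames to suppress per-frame noise."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]

frames = [upsample2x([[10, 20], [30, 40]]),
          upsample2x([[14, 24], [34, 44]])]
result = stack(frames)
assert result[0][0] == 12.0              # (10 + 14) / 2
assert len(result) == 4 and len(result[0]) == 4
```

The missing ingredient in this toy is frame alignment: because consecutive real frames land at slightly different sub-pixel positions, aligning them before averaging recovers detail finer than one original pixel.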

And this is the 17th day that Liu Boyang began to do this project.

Next up: launching a small rocket

Asked about the hardest stage of the whole project, Liu Boyang said what stuck with him most was getting the telescope to be callable from Python code:

For someone like me with weak programming skills, development was a complete black box at first.

Liu Boyang did his undergraduate degree at Peking University and his PhD at the University of Western Australia, both in astrophysics.

The major requires basic programming skill, but Liu Boyang's related college courses, such as introduction to computing and data structures, all ended in low grades or deferred exams.

During his PhD, a great deal of data processing had to be scripted, and he began learning programming languages in earnest.

He chose to write his own telescope-control code this time not only because he couldn't find ready-made software, but also to keep exercising his programming ability.

So will this set of code be open source?

When we asked this question, Liu Boyang said:

At the very least I want to debug and optimize the code to a point I'm satisfied with before considering the next step.

His nearest target is another transit of the Chinese space station in two weeks' time.

Having "practiced" successfully on the ISS, Liu Boyang is full of confidence, and is weighing whether to narrow the field of view slightly on the next acquisition to improve shooting precision.

If the Chinese space station shoot succeeds, it will wrap up before April 21, after which he will head to Qinghai for a new project: launching a small rocket carrying his own camera.

Looking further ahead, Liu Boyang mentioned that the second half of this year may bring Shenzhou-series launches as well as the launch of the experimental module; he plans to take his space-station tracking program along and point it at the big rockets.

Such a hard-core "practice" plan is, without question, fitting for an avid space enthusiast.

Liu Boyang finally said this:

I've been interested in astronomy since childhood, which is why I studied astrophysics through both my master's and my PhD.

But as China's space missions multiply, I've had more and more chances to get involved in related activities, so an interest in spaceflight has grown too, and it has now become a major hobby.

Reference Links:

[1]https://weibo.com/1144755982/LmV8Cp72V

[2]https://zhuanlan.zhihu.com/p/493080686

[3]https://www.heavens-above.com/orbit.aspx?satid=25544

[4]https://mp.weixin.qq.com/s/gNueq8lDQz_86Ifuw8n6Pg#rd
