Over the last week I've been looking at ways to detect star trailing in the images we take. I considered a number of approaches and found one that looks promising: I've reworked a piece of code from the telescope's auto-focusing routines, modifying it to detect trailed images.
How it works: (the technical bit)
We pass an array of image data into the new code, which copies it to a new array so that we can do interesting things to the data without damaging the source.
The image data is then thresholded so that only the stars remain in the image (we get a bit of noise there too, but I'll come to that in a minute). The threshold level is one of the tunable parameters for this process, so we will be working to tune it as the testing continues.
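The thresholding step can be sketched as follows. This is a hypothetical illustration, not the telescope's actual routine; the function name, and the idea of representing the image as a 2D list of brightness values, are my assumptions.

```python
def threshold_image(pixels, level):
    """Return a copy of the image with every pixel below `level` zeroed.

    `pixels` is a 2D list of brightness values; `level` is the tunable
    threshold discussed above. Working on a copy leaves the source data
    untouched. (Hypothetical sketch, not the real telescope code.)
    """
    return [[v if v >= level else 0 for v in row] for row in pixels]


# Example: only pixels at or above the threshold survive.
frame = [[1, 9],
         [3, 7]]
print(threshold_image(frame, 5))  # → [[0, 9], [0, 7]]
```

After this pass, anything still non-zero is either a star or the residual noise mentioned above.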
We then count the number of non-zero pixels in the sub-frame of star data. We have a minimum pixel count for a valid star; below this count the data is normally just noise.
If the star is still valid, we iterate through the sub-frame and find the minimum and maximum rows and columns containing star data. From these we can calculate the height and width of the star, and the ratio of the two.
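The noise rejection and the height/width measurement together might look something like this. The function name, the default minimum pixel count, and the returned tuple shape are all illustrative assumptions on my part:

```python
def star_metrics(subframe, min_pixels=4):
    """Measure a thresholded sub-frame of star data.

    Counts the non-zero pixels; if there are fewer than `min_pixels`
    the data is treated as noise and None is returned. Otherwise the
    min/max rows and columns give the star's height, width, and the
    ratio of the two. (Hypothetical sketch; `min_pixels=4` is a
    made-up default, not the telescope's tuned value.)
    """
    coords = [(r, c)
              for r, row in enumerate(subframe)
              for c, v in enumerate(row) if v > 0]
    if len(coords) < min_pixels:
        return None  # too few pixels: normally just noise
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    return height, width, height / width


# A 2-pixel-tall, 3-pixel-wide blob: ratio well away from 1.0,
# which is the kind of elongation a trailed star would show.
blob = [[0, 5, 5, 5],
        [0, 5, 5, 5]]
print(star_metrics(blob))  # → (2, 3, 0.6666666666666666)
```

A round, untrailed star should give a ratio close to 1.0; a trailed one stretches along the direction of drift.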
This data is then stored in a vector with all of the other star data to be analysed once all of the stars have been found.
The valid star data from the sub-frame is then removed from the source image, and the star-finding routine repeats until the whole image has been checked (or until too many stars have been found, at which point we give up and flag an error).
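Putting the steps above together, the overall loop might be sketched like this. I'm assuming a simple flood fill to grow each star from a seed pixel and blank it out of the working copy; the real code may locate and remove sub-frames differently, and all names and limits here are illustrative:

```python
def find_stars(image, level=5, min_pixels=4, max_stars=100):
    """Find stars in an image and return (height, width, ratio) tuples.

    Thresholds a working copy, then repeatedly: pick a remaining lit
    pixel, flood-fill the 4-connected blob around it (blanking it from
    the copy, i.e. 'removing the star data'), measure it if it has
    enough pixels, and continue until nothing is left above threshold.
    Gives up with an error if too many stars turn up. (Hypothetical
    sketch; the telescope code's actual structure may differ.)
    """
    work = [[v if v >= level else 0 for v in row] for row in image]
    stars = []
    while True:
        seed = next(((r, c) for r, row in enumerate(work)
                     for c, v in enumerate(row) if v > 0), None)
        if seed is None:
            return stars  # whole image has been checked
        blob, stack = [], [seed]
        while stack:
            r, c = stack.pop()
            if 0 <= r < len(work) and 0 <= c < len(work[0]) and work[r][c] > 0:
                work[r][c] = 0  # remove the star data from the working image
                blob.append((r, c))
                stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        if len(blob) >= min_pixels:  # below this it's noise, already blanked
            rows = [r for r, _ in blob]
            cols = [c for _, c in blob]
            h = max(rows) - min(rows) + 1
            w = max(cols) - min(cols) + 1
            stars.append((h, w, h / w))
        if len(stars) > max_stars:
            raise RuntimeError("too many stars found; flagging an error")


# One elongated blob in an otherwise empty frame.
img = [[0, 0, 0, 0],
       [0, 9, 9, 9],
       [0, 9, 9, 9],
       [0, 0, 0, 0]]
print(find_stars(img))  # → [(2, 3, 0.6666666666666666)]
```

Each pass removes the pixels it has already measured, so the same star is never counted twice and the loop is guaranteed to terminate.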
Once we have collected information about all of the stars in the image we can then look at this data and try to decide if the image is trailed or not... But I'll describe that in another post.