SAR imagery doesn't offer amazing spatial resolution, but it's often good enough for tasks like identifying shipping. Water is an unusually flat surface, so metal objects floating on it give a strong return against a low background signal*. Some computationally demanding image processing later, and you can pick out ship locations. I've got a vague idea that it could be interesting to find ships in the territorial waters of North Korea, correlate them against AIS tracks, and try to spot some sanction-busting shipping running dark without AIS.
*This makes me wonder: the USSR really struggled with the power requirements for the radar on its RORSAT ocean-monitoring satellites, to the point that it ended up powering them with onboard nuclear reactors, among the few fission reactors ever launched into space. Why is SAR so much more efficient?
@spinflip If you have any questions about processing that data, hit me up! I might be able to give some pointers.
Are you already familiar with opencv?
@pkok opencv? Not at all. What is it?
And I may well do that! Is your experience in imagery, SAR, Python in general, SNAPPy in particular?
@spinflip Python in general, and have some specific computer vision knowledge ;)
OpenCV is *the* open source computer vision library, used by a large part of the scientific community. If you know what you want to find in your images (e.g., changes over time, recognizing roads/lines/..., inspecting region sizes), let me know and I'll try to give an informative toot about it.
@pkok ah, I think I might actually have heard of that! Right now I'm still very much focused on data collection + processing. Image recognition would be cool once I've started to stack processed TIFFs up.
I'm going to be away from my (shitty, struggling) computer this weekend, but I might have a few python Qs for you next week. My experience so far has all been in data analysis, and the whole http/API/file management sphere is completely new to me
@spinflip Yeah, I'm not big on that part either. Hit me up whenever you like!
@spinflip could chalk it up to the fact that America is, if not by a lot, definitely technologically ahead of Russia and the majority of the world (give or take a few choice countries). I'd wager that because of when it was the USSR they didn't advance as well as everyone else, so when they became whatever version of democracy they are now, they were left playing catch-up.
@Tsunachi well, I'd say it's probably non-trivial to put a nuclear reactor-driven high power radar setup into space. SAR seems to do the same job more efficiently, so I'd be interested to know why RORSAT didn't use a similar setup. I assume it was impossible at the time, but why is that so?
@spinflip could it be advances in sensors? All things being equal, though, SAR combines multiple scans of the same area, so anything with a signal-to-noise ratio higher than the background will tend to stand out, since some of the background noise cancels out as you combine scans.
@drewfer I'd guess so, but which advances in particular? I'd guess it's most likely to be linked to onboard data storage and processing, but have no real evidence to support that
@spinflip My gut suspicion is changes in T/R (transmit/receive) module tech. I found an IEEE article[1] that provides a little support, but I don't know enough about the specific platforms to verify that. I'm not a radar engineer, but I'm friends with several. I'll ask around if you want specific answers.
Anyway, that's all background. Today I spent a few hours trying to set up python scripts to find the most recent Sentinel passes over a given geographical location and download the associated imagery products. This would almost certainly be trivial for anyone with a proper computing background, but for this procrastinating chemist the steps involved learning how to:
• Find the relevant API and options: ✔️, not too bad
• Make an authenticated HTTP call to the constructed URI ✔️
• Parse the returned XML into a pandas dataframe containing the imaging mode, acquisition date and time, and unique ID for each frame covering the specified lat/lon location ✔️
• Download each ~1.8 GB image into a folder for processing. Still in progress: I've found out how to make an authenticated HTTP GET request that *should* stream the retrieved file to storage, but the connection seems unstable and I can't get the download to finish. Sorting this out is the next thing I need to do. Once I've got a way to download and archive the specified files, it'll be time to start looking into automated processing and analysis. Still not even really sure what I'm trying to achieve, but I'm having fun and learning stuff so far!
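For anyone following along, the query-and-parse steps above can be sketched roughly like this. The endpoint URL, credentials, query string, and the exact tag names under each Atom `<entry>` are all assumptions on my part; check them against the real API response before relying on any of it:

```python
# Rough sketch: query an OpenSearch-style API for Sentinel-1 frames over a
# point, and flatten the Atom XML response into a pandas DataFrame.
# ASSUMED: the scihub endpoint, the query syntax, and the tag layout under
# each <entry>; verify against an actual response.
import requests
import pandas as pd
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"  # Atom feed XML namespace

def parse_entries(xml_text):
    """Pull the unique ID, title, and acquisition start time out of each <entry>."""
    root = ET.fromstring(xml_text)
    rows = []
    for entry in root.iter(ATOM + "entry"):
        row = {
            "uuid": entry.findtext(ATOM + "id"),
            "title": entry.findtext(ATOM + "title"),
        }
        # acquisition start time is assumed to appear as <date name="beginposition">
        for date in entry.iter(ATOM + "date"):
            if date.get("name") == "beginposition":
                row["beginposition"] = date.text
        rows.append(row)
    return pd.DataFrame(rows)

def fetch_recent_frames(lat, lon, auth):
    """Query the (assumed) search endpoint for Sentinel-1 frames over (lat, lon)."""
    query = f'footprint:"Intersects({lat}, {lon})" AND platformname:Sentinel-1'
    resp = requests.get(
        "https://scihub.copernicus.eu/dhus/search",
        params={"q": query, "rows": 25},
        auth=auth,      # (username, password) for the hub account
        timeout=60,
    )
    resp.raise_for_status()
    return parse_entries(resp.text)

# Hypothetical usage, e.g. somewhere off Nampo:
# df = fetch_recent_frames(38.7, 125.4, ("user", "password"))
```

The DataFrame then gives you one row per frame, which is a convenient shape for filtering by date before deciding what to download.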
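On the unstable downloads: a common fix for big files over a flaky connection is to stream to a `.part` file and, after each dropped connection, resume from where it got to with an HTTP `Range` header. This is only a sketch under the assumption that the server honours range requests (look for a 206 response; when it answers 200 instead, you have to restart from zero); the URL and credentials are placeholders:

```python
# Resumable streaming download: write to dest.part, retry on network errors,
# and ask the server for only the missing bytes on each retry.
# ASSUMED: the product server supports HTTP Range requests; verify first.
import os
import requests

def resume_header(existing_bytes):
    """Request only the bytes we don't have yet (empty dict = whole file)."""
    return {"Range": f"bytes={existing_bytes}-"} if existing_bytes else {}

def download(url, dest, auth, chunk_size=1 << 20, max_retries=5):
    part = dest + ".part"
    for _attempt in range(max_retries):
        done = os.path.getsize(part) if os.path.exists(part) else 0
        try:
            with requests.get(url, auth=auth, stream=True, timeout=120,
                              headers=resume_header(done)) as resp:
                resp.raise_for_status()
                # 206 means the server honoured the Range header, so append;
                # anything else means we got the whole file, so overwrite.
                mode = "ab" if resp.status_code == 206 else "wb"
                with open(part, mode) as fh:
                    for chunk in resp.iter_content(chunk_size):
                        fh.write(chunk)
            os.rename(part, dest)  # only rename once fully downloaded
            return dest
        except requests.RequestException:
            continue  # retry, resuming from wherever the .part file got to
    raise RuntimeError(f"giving up on {url} after {max_retries} attempts")
```

The `.part` rename also means a half-finished file never gets mistaken for a complete product by whatever processing step comes next.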