Astrophotography Under City Skies

PowerPoint presentation given to Imperial College London

Imaging the Planet Mars

Mars - 12th September 2020

With Mars reaching opposition on 13th October this year (2020), now is the time to get imaging 'The Red Planet' while it is at its biggest and brightest for many years to come.

Mars has been a great target since early August and will remain so until at least December. In fact, November and December offer the best times to view the planet if you don't like late nights or early mornings, especially if you are not an obsessive like me who doesn't mind getting out of bed at 3 am to view and image the planet in the months before opposition!

This post will give you a few tips on imaging this beautiful planet, from capture right through to final processing. Hopefully you'll find it useful.

The Imaging Train

The information in this post relates to my setup, which is a Celestron Edge HD11 scope, Televue 2.5x Powermate and ASI174MM camera. Filters mentioned are Baader 685nm IR and ZWO R, G, B. Images are captured using filters and a mono camera, with colour images being created in PhotoShop CC. Software used is AutoStakkert 3, Registax 6, WinJupos (all free for non-commercial use) and PhotoShop Creative Cloud.

Photo 13-09-2020, 11 45 37.jpg

The picture above shows my imaging train for planetary work. From top to bottom it shows the Televue 2.5x Powermate, ZWO ADC (I tend not to use it for Mars while the planet is riding high in the sky, as it is at the moment, so would typically remove it; when I do use it I remove the flip mirror), flip mirror and eyepiece, filter wheel, and ZWO ASI174MM camera. The top of the picture shows the electronic focuser. I'll go into a little more detail on the purpose of each of these below:

Focus - It goes without saying that if the data you are capturing is not properly focused the end result, no matter how much post-processing you do, will be disappointing. When imaging at long focal lengths it can be a challenge to achieve that optimal focus, especially if you have to touch the telescope to turn the focusing knob. An electronic focuser really helps as it enables you to focus without touching the telescope and makes finding fine focus relatively easy. I can't recommend one highly enough if you are serious about high resolution imaging. For Mars, focus on the bright limb of the planet and achieve fine focus on the albedo features. Remember you will need to re-focus after each filter change, even if you believe your filters to be parfocal.

Image Size - When imaging any planet you want it to occupy as much of your chip as possible so image amplification is essential. I use a 2.5x Televue Powermate which increases the focal length of my setup to 7000mm providing a decent image scale when paired with my Celestron Edge HD11 scope.

Filters - As I’m imaging using a mono planetary camera I have to place filters in front of it to ultimately achieve a colour image. The filters I use for Mars are 685nm IR, Red and Blue (more on this later!)

Camera - For best results a high frame rate planetary camera is a must. I would recommend a mono camera for best resolution, but some colour cameras do produce good results and are easier to use, perhaps at the sacrifice of versatility and resolution.
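
If you want a feel for the image scale this combination gives, a few lines of Python will do the sums. This is only a sketch: the pixel size is my assumed figure for the ASI174MM and the apparent diameter of Mars is approximate, so substitute your own numbers.

    # Rough image-scale check for a planetary setup (sketch only).
    native_focal_length_mm = 2800      # Celestron Edge HD11
    barlow_factor = 2.5                # Televue 2.5x Powermate
    pixel_size_um = 5.86               # assumed pixel pitch of the ASI174MM

    effective_fl_mm = native_focal_length_mm * barlow_factor     # 7000 mm
    image_scale = 206.265 * pixel_size_um / effective_fl_mm      # arcsec per pixel

    mars_diameter_arcsec = 22.5        # approximate apparent size near the 2020 opposition
    print(f"Effective focal length: {effective_fl_mm:.0f} mm")
    print(f"Image scale: {image_scale:.3f} arcsec/pixel")
    print(f"Mars spans roughly {mars_diameter_arcsec / image_scale:.0f} pixels on the chip")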

Capturing Your Data

For image acquisition I use SharpCap Pro but you can also use FireCapture which is many people’s preferred capture software, either of these does a good job. I started off my planetary imaging journey using SharpCap and feel most comfortable with it so haven’t felt the need to change.

How much data should I capture? At the moment Mars is an absolute joy to image for us northern hemisphere astrophotographers: it is at a good altitude and bright, something we have been starved of over the past few years with both Jupiter and Saturn skimming our horizons! A benefit of this is that you can capture lots of data in a very short period of time. I aim to image five sets of 4000 frames through IR (optional), red and blue filters; you can also image through a green filter, but I like to create a synthetic green channel (more in the final processing section). Once done you will have generated 20,000 frames through each filter, 60,000 in total, or 80,000 if you are also shooting a green channel! It sounds like a lot, but you can capture all of it within 10 minutes.

What settings should I use? Everyone’s setup is different so settings will differ as well, however I generally aim for:

  • Histogram at ~50% (25% for blue channel)

  • Gain at ~50%

  • Frame rate around 150 fps

Obviously each of these influences the others, but as we are capturing so much data you can use a reasonably high gain setting to get yourself within the other parameters I've stated.
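
To see why such a large haul only takes a few minutes, here is some back-of-the-envelope arithmetic in Python; the frame rate and the filter-change overhead are assumptions, so adjust them to match your own run.

    # Rough capture-time estimate for a Mars imaging run (sketch only).
    frames_per_video = 4000
    videos_per_filter = 5
    filters = ["IR", "Red", "Blue"]
    fps = 150                          # roughly what I aim for
    refocus_overhead_s = 30            # assumed time to change filter and re-focus

    capture_s = frames_per_video * videos_per_filter / fps            # per filter
    total_s = capture_s * len(filters) + refocus_overhead_s * (len(filters) - 1)
    total_frames = frames_per_video * videos_per_filter * len(filters)
    print(f"{total_frames} frames in roughly {total_s / 60:.1f} minutes")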

Capturing Mars under reasonable seeing using SharpCap - 685nm IR Filter

Below you can see the .ser files produced by the imaging run shown above. SharpCap creates a directory on your desktop called 'SharpCap Captures' and creates sub-directories dependent on the date on which you are imaging. In addition, if you provide a name for the target you are shooting it will also create a sub-directory with that name, in this case Mars. All this is configurable in the application, as is the file naming convention you wish to apply to your data files; make sure you choose the WinJupos format as this will save you some time later.
Below you can see that five .ser files have been created for the IR data; I then changed filters, re-focused and generated five for the red data, and the same again for the blue. I captured each set as a batch to minimise filter change and re-focus time, and the whole imaging run only took seven minutes.

Now the data has been captured you are ready to move on to the processing stage.

File structures.JPG

Stacking and Sharpening the Data

The first processing step is to stack the videos you have captured; to do this I recommend using AutoStakkert 3. The image below shows one of the captured .ser files open in the application, along with how I would normally set the various options on the screen. The only item I may change is the percentage of frames to stack, as this will vary depending on the quality of the input video. I try to keep it around 50% but always run an analysis on the video first to confirm this is OK; if not, I'll reduce the percentage accordingly. Another point to note is that you don't need masses of alignment points; here I have just 9, selected by the software with multi-scale checked. A large number of small alignment points can in fact often produce a much worse result.

Once I'm happy everything is set correctly I will stack the first file. Once this is complete, I'll select the others in file manager, simply drag them onto the open AutoStakkert window and hit stack again. This will process the other files as a batch using the same parameters as the first video - at this point you can go off and have a coffee!

Autostakkert Mars.JPG

AutoStakkert will create a sub-directory within the directory holding the videos as shown in the image below. The name relates to the percentage of frames you stacked, or the number of frames depending on the stack option selected.

As processed.JPG

These files are now ready to process in Registax. Before I start processing I create a ‘Processed’ folder within the stacked images folder, this is where I will save the sharpened images.

You can see here one of the stacked IR videos with sharpening applied. Don't take the settings shown here as the ones you should use, as this is totally dependent on the dataset you are processing; however, make sure you don't over sharpen - less is more!

Registax post.JPG

I repeat the sharpening process for each of the 15 stacked videos; you'll find the wavelets you've selected for the IR and red filtered images will remain pretty consistent. For the blue images you can be a little more aggressive with your sharpening, as this is where you will be capturing any cloud detail.

Each sharpened image is saved in the ‘Processed’ directory with the AutoStakkert suffix removed as shown below - this enables them to be imported into WinJupos for the next stage of processing.

sharpened.JPG

De-rotation in WinJupos

Before we begin the WinJupos magic, remember what I said about image scale? I shot the videos using a capture area of 640x480, as the planet fits into this area and reducing the area captured also means you can achieve higher frame rates. Before loading into WinJupos I increase this to 200% using the [File] [Scripts] [Image Processor] menu item in PhotoShop CC. In the image below you can see that I have selected my 'Processed' folder, am saving as a TIFF file and have used 'Resize to Fit' 1280x960, doubling the image size.

Hit Run and PhotoShop will create yet another sub-directory called TIFF where the re-sized images will be saved; these are the ones we are going to load into WinJupos and process further.

PS file scripts image processor.JPG
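
If you don't have PhotoShop to hand, the same doubling can be scripted. Below is a minimal sketch using the Pillow library; the folder names and the .tif extension are assumptions based on the structure described above, so adjust them to match your own files.

    # Double the size of every sharpened image before loading it into WinJupos.
    from pathlib import Path
    from PIL import Image

    src = Path("Processed")            # sharpened Registax output (assumed location)
    dst = src / "TIFF"                 # mirrors the sub-directory Photoshop creates
    dst.mkdir(exist_ok=True)

    for f in sorted(src.glob("*.tif")):
        img = Image.open(f)
        up = img.resize((img.width * 2, img.height * 2), Image.Resampling.LANCZOS)
        up.save(dst / f.name, format="TIFF")
        print(f"{f.name}: {img.size} -> {up.size}")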

People can be a little wary of using WinJupos but really there is nothing to fear, I’ll start with a little refresher.

  • Make sure you choose the correct celestial body (Menu item [Program] [Celestial Body]), in this case Mars.

  • Then choose menu items [Recording] [Image Measurement] and open the first image in the folder containing the resized images. The file name you saved your images under earlier will automatically set the date and UT values; you may have to input your longitude and latitude.

  • Next choose the Adj tab and use the Up/Down/Left/Right keys on your keyboard to move the outline frame onto the planet. Make the frame larger using Page Up and smaller using Page Down, and rotate it to the correct north-south orientation using the N & P keys.

  • A useful feature for Mars is that in the Adj tab you can display the outline frame with an additional graphic that adds the main features visible at the time your image was taken. This serves as a really useful way of correctly aligning the frame.

  • Another feature that is useful in aligning is the 'LD Compensation' check box; this will show the hidden edges of the planet for fine tuning.

  • Once happy deselect the LD compensation and go back to the ‘Imag.’ tab. There I input the filter used in ‘Image Info’ as this helps when compiling the final image - you should also record the channel in the ‘Adj’ tab.

WinJupos Open Image.JPG

Press F2 or click the Save button to save the image measurement file; I create yet another sub-directory to save these in, shown below. Repeat the process for all the sharpened images, making any fine adjustments along the way - you will then end up with five image measurement files for each filter you shot.

WinJupos Save.JPG

Now to de-rotation!

  • Choose menu Item [Tools] [De-rotation of images]

  • Click on Edit and select Add, then select all the image measurement files for one of your filter sets. You should have five files.

  • Change the LD value to around 0.80, otherwise you may get unwanted artefacts.

  • Select the destination directory (I’ve chosen my ‘Processed’ directory)

  • Make sure you have the correct north-south orientation and hit Compile Image (F12) - this will create your de-rotated image for the selected filter set.

  • Repeat this process for all the other filter sets you have generated.

WinJupos De-rotation.JPG

Here are the de-rotated IR, Red and Blue images. This is my final step in WinJupos, but if you shot a set of green filter images you can then choose [Tools] [De-rotation of R/G/B frames] to construct an RGB image from your files (an image measurement file will have been automatically created for this purpose).

Winjupos de-rotate.JPG

Processing Your De-rotated Images

You are now on the home straight; it's time to create that synthetic green channel I've been referring to throughout this post.

  • First open the red and blue de-rotated images in PhotoShop and duplicate the blue image [Image] [Duplicate]

  • Copy the red image and paste it as a layer on the duplicate blue image

  • The red layer should be ‘normal’ and opacity should be changed to around 50%

  • Flatten the layers and save the result as your synthetic green image. This is the image you will use as your green channel (a scripted version of these steps is sketched after the screenshots below).

PS step 1.JPG
PS Step 2.JPG
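
For those who prefer to script this step, blending the red image over the blue at 50% opacity is simply the average of the two. Here is a minimal sketch using numpy and Pillow; the file names are placeholders and it assumes 8-bit greyscale de-rotated images, so adapt it to your own data.

    # Build the synthetic green channel as the mean of red and blue (sketch).
    import numpy as np
    from PIL import Image

    red = np.asarray(Image.open("mars_red_derotated.tif"), dtype=np.float32)
    blue = np.asarray(Image.open("mars_blue_derotated.tif"), dtype=np.float32)

    green = (red + blue) / 2.0         # red pasted over blue at 50% opacity
    Image.fromarray(green.astype(np.uint8)).save("mars_synthetic_green.tif")

    # You can also assemble the RGB image here rather than pasting channels in PhotoShop.
    rgb = np.dstack([red, green, blue]).astype(np.uint8)
    Image.fromarray(rgb).save("mars_RGB.tif")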

Below we can see the RGB image (top right) created by pasting the red, blue and synthetic green images into the relevant channels. Give this image a relevant name (RGB) and save to the directory you are storing your processed images in.

PS Step 3.JPG

For the next stage we leave PhotoShop and go back into Registax. Here we will perform an RGB align and balance. First open your RGB image and choose RGB Align from the functions menu. Make sure the green box covers the whole planet and then click Estimate.

REgistax post process 1.JPG

Next select RGB Balance and click Auto Balance. Once done, remember to click 'Do All' and 'Save Image'.

REgistax post process 2.JPG
REgistax post process 3.JPG

You have now completed processing on your RGB image; open it in PhotoShop along with your IR image. There is a bit of debate as to whether you should use an IR or red image as the luminance layer when imaging Mars, but I find IR gives better results from my city location.

Paste the IR image as a layer over the RGB image and set the blend mode to Luminosity. Save this as a new image named IRRGB.

PS IR RGB1.JPG

You can then perform some final processing on the image. I tend to increase saturation slightly, as applying the luminance layer tends to wash out the image a little. I may also adjust contrast and colour balance, as with the image below, and perhaps perform a little more sharpening with an unsharp mask and some additional noise reduction.

PS IRRGB.JPG

You can perform some final tweaks if you want to improve the aesthetics of the image. Below I've used crop to increase the black space around the planet and tilted it using Free Transform. Once you're happy, flatten layers as necessary and save the final image. You're done - well done for getting to the end and happy Mars season!

PS Final.JPG

Is it time to replace the blue glass in your Lunt solar scope?

If you are the proud owner of a Lunt Ha solar telescope and have noticed that the amount of detail you are seeing through the eyepiece or when imaging is not what it once was, never fear, you aren't going crazy; it may just be time to replace the blue-glass filter.

When I took my Lunt LS60 THa scope out of mothballs this year in readiness for a summer of solar observing, I found that the views I was getting of the Sun were terrible: fine detail was completely absent and prominences were no longer visible. Luckily the issue could be tracked down to one component, the blue-glass filter that sits at the entry point to the diagonal containing the blocking filter.

Blocking filter with tarnished blue-glass

Over time (I've had my scope since 2014) humidity can affect the surface of the blue-glass filter, causing the tarnishing seen in the images below. I was amazed I could see anything through such a compromised filter, as it had turned almost completely opaque. I contacted Lunt through their website www.luntsolarsystems.com and they soon got back to me, confirming that they would send a replacement blue-glass filter free of charge via their European distributor, Bresser GmbH. Within a few days the replacement filter arrived on my doormat - fantastic customer service, thank you Lunt and Bresser!

Old blue glass filter and the replacement - spot the difference

Replacing the filter couldn't be easier. If the filter is being removed for the first time then you may have to rub off a small amount of silicone, applied by the manufacturer to ensure the filter remains in place during shipping. Next you can use either some fine pointed pliers or, as in my case, a set of compasses to unscrew the retaining ring, as shown in the image below. Once the ring has been unscrewed you can remove the old filter and pop in the new one. When screwing the retaining ring back on, keep going until it is finger tight and then go back a quarter turn to allow for expansion as the filter heats up during a solar observing session.

Removing the old filter

You can now screw the tube back on to your blocking filter and mount it back in the scope. That’s it, you’re done, it really couldn’t be easier. I’m sure my issues were caused by storing my scope in a humid environment with no silica gel in its flight case. Now it sits in the house rather than the observatory and has a few bags of silica gel around the blocking filter to keep humidity to a minimum. Lesson learnt, but easy to fix thanks to the good people at Lunt and Bresser.

Blocking filter ready for action

Something From The Archives!

Back in 1984/85, when I was 17, I wrote some articles for ‘Stardust’, the journal of the Irish Astronomical Association. I’d totally forgotten about this until I found myself rifling through some boxes in the garage of my mother’s house and amazingly unearthed them! They need to be read through the lens of 35 years having passed, but I thought I’d reproduce scanned copies of the articles here for posterity.

Subjects covered are lunar exploration, the search for extra-terrestrial life and building a Dobsonian telescope. Enjoy!

The Search For Extra Terrestrial Intelligence - Autumn 1984

ET cover.jpg
ET Page 1.jpg
ET Page 2.jpg
ET Page 3.jpg
ET Page 4.jpg
ET Page 5.jpg

Exploration of the Moon - Spring 1985

Lunar 1.jpg
Lunar 2.jpg
Lunar 3.jpg
Lunar 4.jpg
Lunar 5.jpg
Lunar 6.jpg

How To Build A Dobsonian Telescope - Spring 1984

Dobsonian cover.jpg
Dobsonian 1.jpg
Dobsonian 2.jpg
Dobsonian 3.jpg
Dobsonian 4.jpg

A Cheat's Guide to Creating Smooth Planetary Animations

In a previous post I detailed how to capture and process high resolution planetary images, but what if you want to do something a little different with that hard earned data? A planetary animation is a really fun way of presenting your images and isn’t as hard to achieve as you might think, though it does require a little perseverance.

The workflow is shown below and described in the remainder of this post.

Planetary Animation Workflow

Choosing a Target

In my guide to planetary imaging and processing previously posted on this site I cover planetary rotation periods and the challenges they present in the capture of high resolution images. When creating planetary animations high rotation rates are no longer your enemy, but your friend.

To gain the best results we want to choose a planet with a fast rotation period and clear surface features, basically Mars and Jupiter. All the other planets either rotate too slowly or do not have enough surface detail to enable movement to be tracked by the eye. One possible exception is Venus, where if you are shooting through IR and UV filters you may be able to show some movement of the cloud layers high in the Venusian atmosphere.

How Many Frames Should I Shoot?

From my planetary imaging post you will see that Mars has a rotation period of 24.6 hours and Jupiter one of 9.9 hours. If you shoot Jupiter over a 4 ½ hour period you will be able to capture close to half a rotation of the planet, and if you time your captures with the ingress and egress of the Great Red Spot you will have all the makings of a striking animation. I have found the website Calsky (www.calsky.com) really useful in planning this type of project: choose Planets from the top menu, then the planet you are interested in, then 'Apparent View Data'; you can then play with the dates and times to see what your best window of opportunity is.

The time of year you will be imaging is also a factor, as you will need enough hours of darkness to capture all your frames - a bit of a challenge here in the northern hemisphere, as Mars and Jupiter both reach their best at a time when darkness is at a premium! With Jupiter, video shot over a period of as little as 30 minutes will still produce a nice result, so you don't have to go the full 4 ½ hours!

The method you need to employ to capture your images is described in my guide to planetary imaging and processing, the difference being that you will be taking a lot more shots of the same target. Say you aim to capture an image every 5 minutes: if you shoot using a colour camera, or through one filter with a mono camera (IR or red), you will end up with 12 images an hour, which yields 55 images over a 4 ½ hour period. If you wanted to produce a colour rotation video with a mono camera you would end up with four times that number of videos to process, a hard-drive-busting 220 videos! So I stick to mono through one filter, but you may be more dedicated than me!
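
A few lines of Python make this planning arithmetic easy to play with; the cadence, duration, filter count and per-video size below are assumptions, so substitute your own.

    # Planning arithmetic for an animation run (sketch only).
    cadence_min = 5                    # one capture every 5 minutes
    duration_hr = 4.5                  # roughly half a Jovian rotation
    filters = 1                        # mono through a single IR or red filter
    gb_per_video = 4                   # assumed size of each .ser file

    captures = int(duration_hr * 60 / cadence_min) + 1   # include the starting frame
    videos = captures * filters
    print(f"{captures} captures -> {videos} videos, roughly {videos * gb_per_video} GB")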

Gaining a Smooth and Consistent Result

The key to a good rotation video is smoothness and consistency. The more images you have, the smoother your result will be, and this is where WinJupos comes to your rescue, as it will enable you to generate additional images without having to shoot more video. Here are the steps you need to take (details on how to perform these actions can be found in my planetary imaging post):

  1. Capture and process each of your videos as normal (stack, sharpen etc.)
  2. In WinJupos create an image measurement (IM) file for each of the processed images and de-rotate each individual IM file; you should now have 55 WinJupos-generated images (for 4 ½ hours' worth of data).
  3. Now load each contiguous pair of IM files and de-rotate them, e.g. load timestamps 00:00:00 & 00:05:00 and the result will be the mid-point between the two. If you do this for each adjacent pair you will end up with 54 extra images to add to your rotation video - almost double what you started with! You now have a healthy 109 images to make up your rotation video, with an effective gap of 2 ½ minutes between each frame, which should yield a lovely smooth result (see the sketch below).
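
Here is a small sketch of that pairing step, just to show where the 109 frames and the 2 ½ minute spacing come from; the timestamps are illustrative, not real file names.

    # De-rotating each adjacent pair of image measurements to its mid-point
    # almost doubles the number of frames (sketch with made-up timestamps).
    from datetime import datetime, timedelta

    start = datetime(2020, 9, 12, 22, 0, 0)
    originals = [start + timedelta(minutes=5 * i) for i in range(55)]
    midpoints = [a + (b - a) / 2 for a, b in zip(originals, originals[1:])]

    frames = sorted(originals + midpoints)
    step = frames[1] - frames[0]
    print(f"{len(originals)} originals + {len(midpoints)} mid-points = {len(frames)} frames")
    print(f"effective spacing: {step.total_seconds() / 60:.1f} minutes")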

Once all the images making up the rotation video have been generated and further sharpened, it is a good idea to open adjacent frames side by side in PhotoShop and check that there are no large variations in levels etc., as this will affect the quality of your final animation. Try to make each adjacent frame as consistent as possible; tedious, but well worth it in the end.

Generating Your Animation

Here I'll discuss how to generate your animation using the free software Planetary Imaging PreProcessor (PIPP - https://sites.google.com/site/astropipp/). There are lots of animation applications out there, for example, you can use PhotoShop to create a frame animation, or WinJupos to create an animation using a surface map pasted onto a sphere. Both these methods can yield great results but I'll limit myself to PIPP for now.

Creating a Planetary Animation in PIPP

On opening PIPP you will see a number of tabs; I'll go through the relevant ones in order.

Source Files - This tab is selected by default when you open the application. Here you need to click [Add Image Files] and select the frames you want to use for your animation. Make sure they are loaded in order: if they are timestamped they will automatically load in the correct sequence, otherwise you may need to make sure the start frame is at the top of the list and the last frame at the end, with all intervening frames in the correct order. Once you've opened your images a pop-up will inform you that join mode has been selected; this is what you want, so just OK the message. Make sure the Planetary Animation box is ticked, as this will optimise the settings in the other tabs automatically.

Choose the images making up your animation

Images loaded into PIPP, Join mode selected and Planetary Animation ticked

Processing Options - The selections in this tab will default to those the software determines as optimal for planetary animation; however, you may want to alter some of these. I've found that the object detection threshold may need to be amended if your images are quite dim, and I also find it worthwhile enabling cropping so that you don't end up with a lot of blank space around your subject. There are handy 'test options' buttons which enable you to check that the options you have selected provide the desired result; if not, just tweak them until they do.

Processing options with object detection changed and cropping enabled

Animation Options - Here you can select how you want your frames to be played, i.e. forwards or backwards, the number of repeats, the delay between repeats etc. Don't make the delay too long; just enough to show that the animation is cycling through the images again is what you want.

Animation Options

Output Options - In this tab you can choose where you want your animation to be saved, the output format, the quality and the playback frames per second. I've found that 10 frames per second gives a nice result, so aim for that if you have enough images; if you have a large number of input images you can go even higher - the higher the frames per second, the smoother the final result. You can save as an AVI or GIF. I tend to save my animation as an AVI and convert it to a GIF in Photoshop; I describe how to do this in a later section of this post.

Output Options

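As a quick sanity check on playback rate, this sketch shows how long the animation will run for a given frame count (using the 109 frames carried over from the de-rotation example).

    # Playback duration for different frame rates (sketch).
    frames = 109                       # the de-rotated set built earlier
    for fps in (5, 10, 15, 20):
        print(f"{fps:>2} fps -> {frames / fps:5.1f} seconds per cycle")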

Do Processing - Once you are happy with the settings you have chosen, go to the Do Processing tab and click [Start Processing]. The AVI or GIF will then be generated and saved to the location you specified in the Output Options tab. If you get a processing error where the object has not been detected, just go back, change your object detection threshold and try again. Your planetary animation should now be ready to enjoy and share!

Image processing in PIPP and saved AVI

Converting Your Video to a GIF in Photoshop

While you can save your output as a GIF in PIPP, I tend to do this as an additional step in PhotoShop; it's not essential, but I find it gives a slightly better result. If you don't have PhotoShop or are happy with the output from PIPP then you don't need to perform this step; if you do want to try it, here's what you need to do.

  • Open Photoshop (obviously)

  • Go to [File] [Import] [Video Frames to Layers..]

Import video file as layers in PhotoShop

  • When the options box below appears ensure 'Make Frame Animation' is checked. You can also choose whether you want to limit the frames processed, though generally for planetary animations you would want all frames. 

Choose the number of frames you want to include and Ensure Make frame animation is checked

  • Select [File] [Export] [Save for Web (Legacy)…]; this may just be [File] [Save for Web] in older versions

Go to File, Export, Save for Web

  • You can choose various options from the menus within the GIF screen presented; generally I just set the noise settings as shown.

Choose your options from the gif screen

  • Now save and share!

High Resolution Imaging of the Moon

I often get asked about how I capture some of the images shown here on my website, especially the processes around capturing high resolution images of the Moon. In this post I’ll attempt to cover what I think are the most important or useful tips, hopefully getting you on the road to producing your own high resolution photos of our nearest neighbour. Many of the points covered also apply to imaging planets and the Sun, topics I plan to cover at a later date.

When to Image

There are some things that will have a profound effect on the quality of your image that you aren’t in control of:

  • Astronomical ‘seeing’
  • The position of the Moon in relation to the horizon

If you look up the term astronomical seeing you find the following definition:

“Astronomical seeing is the blurring and twinkling of astronomical objects like stars due to turbulent mixing in the Earth’s atmosphere, causing variations of the optical refractive index. The astronomical seeing conditions on a given night at a given location describe how much the Earth's atmosphere perturbs the images of stars as seen through a telescope.” Source: Wikipedia

In short, the more the stars twinkle, the worse the seeing is. It may be a beautifully dark and clear night, but if the stars are shimmering away then you may as well forget about high resolution imaging. The effects of poor seeing are even more apparent through a telescope, where it can appear as if the lunar surface is situated at the bottom of a very agitated swimming pool!

You can counteract some of the effects of poor seeing by using filters that allow only the longer wavelengths of light to pass onto the camera chip, typically IR pass and red filters (covered later). It also makes sense not to try to image over rooftops in winter or hard surfaces in the summer, the radiated heat causing turbulence, so try to find a spot overlooking fields or a park. If the seeing is very poor just wait for conditions to improve and enjoy the twinkling stars.

As well as using your eyes, there are a number of websites that will give a good indication of the seeing conditions you are likely to encounter - a useful tool for planning whether it is worthwhile venturing out for the night. I've listed a few below but there are lots more that a simple Google search should throw up.

Meteoblue Seeing forecast: https://www.meteoblue.com/en/weather/forecast/seeing/london_united-kingdom_2643743

Unisys weather: http://weather.unisys.com/gfs/gfs.php?inv=0&plot=300&region=eu&t=12h

Even when the seeing is good, the position of the Moon in the sky is also going to have an effect on the quality of your images. When the Moon is low to the horizon its light has to pass through a greater thickness of atmosphere, which causes the same sort of problems as poor seeing. For best results wait until the Moon is at its highest point (during darkness) and shoot then. In the summer months this can still be quite low, so the best lunar images tend to be shot in the winter, apart from when the Moon is a slim crescent, when spring and autumn are best as the ecliptic is at its steepest.
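
To put some rough numbers on 'a greater thickness of atmosphere', here is a small sketch using the simple sec(z) approximation for relative air mass; it breaks down very close to the horizon, but is fine for this comparison.

    # Relative air mass versus altitude using the sec(z) approximation (sketch).
    import math

    for altitude_deg in (10, 20, 30, 45, 60, 90):
        zenith = math.radians(90 - altitude_deg)
        airmass = 1 / math.cos(zenith)
        print(f"altitude {altitude_deg:>2} deg -> ~{airmass:.1f}x the air you look through at the zenith")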

What to Image

What you image is of course a matter of personal taste, however, anyone who has ever looked up when the Moon is out will appreciate that it is a dynamic object, its appearance changing as the Sun illuminates its surface over the monthly Lunar cycle. The best time to image depends on whether the target you have chosen is a crater or mountain range, a sea or ray structures.

If you want to image a crater or one of the beautiful lunar mountain ranges then it is best to wait until your target is near the terminator (the line between light and dark). The shadows cast will bring the features into sharp relief and best show off their structure.

Copernicus

Copernicus

Lunar seas are best imaged under similar conditions to craters and mountains, a low angle of illumination bringing out features such as rilles, craterlets and domes.

Sinus Iridum

Sinus Iridum

Ray structures such as those around the craters Copernicus, Kepler and Tycho are best imaged under full illumination, so wait until the period around the full Moon before tackling these targets.

Kepler & Copernicus inverted rays

Kepler & Copernicus inverted rays

It is also fun to try to image features near the Lunar poles, especially the heavily cratered south. The foreshortening caused by their position really helps to create a sense of depth to the image, giving the impression of flying over the surface of the Moon.

South from Moretus

South from Moretus

Apps are great for choosing when to image and what will look its best on your chosen night. I’ve found ‘Moon Globe’ to be especially useful and would recommend it.

Equipment

Telescope

For high resolution imaging a simple rule applies: the longer the focal length, the better (within the limits of your local conditions, of course). The issue with imaging at long focal lengths is that the light from your target is dimmed, so good light gathering power is also preferable; a large aperture telescope will produce better results than a smaller aperture instrument.

Celestron Edge HD11

Celestron Edge HD11

Even if you have a wide aperture scope with a long focal length it is still unlikely that this will be enough for high resolution imaging, so you'll need to employ some form of image amplification via a Barlow lens or Powermate. I use a Televue Powermate and have found the 2.5x version (which increases your scope's focal length by 2 ½ times) the best for my set up. This equates to an effective focal length of 7000mm when imaging using my Celestron Edge HD11 scope. At these focal lengths you will also need a good sturdy mount with good tracking, as a highly magnified image will quickly slip out of your field of view if alignment and tracking are poor.
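
To get a feel for how little sky you are working with at that focal length, here is a quick sketch; the sensor dimensions are my assumed figures for the ASI174MM, so swap in your own camera's numbers.

    # Field of view of the chip at a 7000 mm effective focal length (sketch).
    import math

    focal_mm = 2800 * 2.5                  # Edge HD11 with the 2.5x Powermate
    sensor_w_mm, sensor_h_mm = 11.3, 7.1   # assumed ASI174MM active area

    fov_w_arcmin = math.degrees(sensor_w_mm / focal_mm) * 60
    fov_h_arcmin = math.degrees(sensor_h_mm / focal_mm) * 60
    moon_arcmin = 31                       # average apparent diameter of the Moon

    print(f"Field of view: {fov_w_arcmin:.1f} x {fov_h_arcmin:.1f} arcmin")
    print(f"The Moon is ~{moon_arcmin} arcmin across, so a full disc needs a mosaic")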

It is also worth mentioning collimation, as poor collimation will drastically reduce the quality of your images; you can have a big expensive scope, but if it is badly collimated your results will be disappointing. If you have a refractor this shouldn't be an issue; if you have a Newtonian reflector it is likely you will need to re-collimate regularly, less so with Schmidt-Cassegrain scopes.

Poor collimation results from improper alignment of the telescope's mirrors: for light to be properly focused at your eyepiece or camera, the primary and secondary mirrors need to be accurately aligned. How you collimate depends on your scope, so I won't go into details here, but there are lots of resources on the web that will take you through the process.

Camera

For high resolution imaging it isn't a case of pointing and shooting a single image as you would when taking a selfie or snapping something on holiday. To create a high resolution image we need to shoot a video (.AVI or .SER) which we then process into a single stacked and sharpened image (more later). For best results you should use a high frame rate planetary camera with a sensitive chip. The choice between a colour camera and a mono (black and white) camera is down to how much time and effort you want to spend imaging, but I would always recommend a mono camera as this will produce the best results in terms of image quality and resolution, every pixel being used to collect data rather than being shared across a number of colour filters.

You can still take colour images with mono cameras, all you have to do is image your target through red, green and blue filters and combine them in image processing software to create the combined colour image. This is however somewhat redundant in the case of the Moon as a grey-scale image pretty accurately represents what the eye sees anyway.

In terms of the make and model of camera I would recommend those made by ZWO. I use a ZWO ASI174MM camera and have owned an ASI120MM in the past; these are brilliant little cameras and competitively priced.

ZWO ASI174MM camera with filter wheel & Televue 2.5x Powermate

ZWO ASI174MM camera with filter wheel & Televue 2.5x Powermate

It is worth noting that you can also shoot video using a DSLR camera but as I have no experience of this for high res lunar imaging I'll leave that to others more qualified to discuss.

Filters

Filters are a great way to counteract the effects of poor seeing. For lunar imaging I use a Baader 685nm IR pass filter which really helps to improve image quality; a similar effect can be achieved by employing a red filter. IR filters do create a dimmer image (hence a large aperture being preferable), while a red filter will produce a brighter image which may enable you to image at a higher frame rate without having to increase gain, which introduces noise.

Acquiring Your Image

The atmosphere is still, the Moon is high, the phase is right and you have all the gear, congratulations, now you can start acquiring your images!

A planetary camera is really just a chip, so you'll need to link it to a laptop with image acquisition software installed in order to capture your video. There are really only two applications I would recommend for this: FireCapture and SharpCap. Both are free, with FireCapture being the tool of choice for most planetary imagers as it offers the most functionality. I actually prefer the pared-down SharpCap, but really you should try both and see which one works best for you and your equipment; they are both free so you don't have to worry about cost.

I won't go into all the ins and outs of how to use these applications as they both have very good help facilities and are pretty user friendly; I'll just cover some basic points I have found useful.

FireCapture: http://www.firecapture.de/

SharpCap: https://www.sharpcap.co.uk/

Focus

It seems obvious, but it is worth stressing how important good focus is; you can have everything in place and conditions may be perfect, but if you haven't taken the time to focus properly you may as well have stayed indoors on a nice warm sofa. I find the best approach is to find a prominent feature and gently move the image in and out of focus until you find the sweet spot. You'll also need to bear in mind that the point of focus will change as the temperature changes throughout the night, so you'll regularly have to re-focus for the best image. The point of focus will also change when imaging through different filters, so you'll need to slightly change focus on each change from red to green to blue filter.

SharpCap and FireCapture both offer tools that help with focusing and if you want a lower tech approach a Bahtinov mask is very helpful.

Framing

As you are capturing a highly magnified image, make sure it is in the centre of the frame of your video; it can take a little while to capture the frames you want if the frame rate is low, and the last thing you want is for your target to drift out of shot. If you plan to stitch together a panorama consisting of a number of shots, make sure there is plenty of overlap in your frames; there's nothing worse than taking the time to produce a panorama only to find, when you try to process it, that you are missing a thin section from the middle.

Shooting

Generally I aim to shoot around a 2000 – 3000 frame video for later stacking. As the Moon is a bright object you can reduce the gain of your capture (think of it as ISO in a normal camera), which will reduce the noise in your image, meaning the number of frames you need to stack (see later) to produce a low noise result will be quite low.

As you reduce gain you will have to increase your exposure, but take care not to over expose your image. SharpCap has a handy feature called 'Highlight Over Exposed' which causes any overexposed parts of the image to flash; all you have to do is reduce the exposure or gain until this stops. You can also make use of the histogram function within your acquisition software: you need to make sure the graph does not continue all the way to the right hand side, so aim for around 80% (see the image below). It is also worthwhile keeping an eye on the number of frames per second you are capturing; I would recommend around 30 fps as a minimum for lunar imaging.
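
For planning disk space and capture time, here is a rough sketch; the frame dimensions and bit depth are assumptions, so match them to the region of interest and format you actually capture.

    # Recording time and rough file size for a lunar capture (sketch only).
    frames = 3000
    fps = 30                           # suggested minimum for lunar work
    width, height = 1936, 1216         # assumed full frame of an ASI174MM
    bytes_per_pixel = 1                # 8-bit mono

    seconds = frames / fps
    gigabytes = frames * width * height * bytes_per_pixel / 1e9
    print(f"{frames} frames at {fps} fps takes ~{seconds:.0f} s and ~{gigabytes:.1f} GB")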

FireCapture imaging in progress

FireCapture imaging in progress

Processing Your Image

You now have your precious .AVI or .SER file, so the fun bit of creating your final image can begin. Image processing consists of stacking and sharpening the video you have acquired, and to do this there are two fantastic applications that pretty much every lunar, solar and planetary imager uses: AutoStakkert!2 and Registax 6. They also have the added bonus of being free.

The whole stacking and sharpening workflow can be done using Registax, however I find that AutoStakkert does a better job of analysing and stacking so I use it for this and Registax solely for sharpening the image.

The image below shows the initial screen you will be faced with in AutoStakkert once you have opened the video you wish to process. For lunar images make sure you have checked 'Surface'; you will then see a green box appear, which you should place on the main area of interest in your image (if you want to make it larger click Alt-9), then click 'Analyse'. This will perform a rough alignment and order the frames of your video by quality. When analysis is complete a graph will appear which ranks quality against percentage of frames and also shows how much each frame differs; this will help you assess how many frames to stack.

Autostakkert Load and analyse screen

Autostakkert Load and analyse screen

Once the video has been analysed you need to place the alignment points on the image. For lunar images I go for a relatively small size, 64 pixels in the example below; click on 'Place AP grid' and this will place alignment points across the image. One point worth noting is that you shouldn't have any alignment points in the very dark areas of the image, such as the shadow within the crater Copernicus (below), as this will confuse the software and you won't get the best result.

Autostakkert alignment points

Autostakkert alignment points

As you have shot your images with low gain you will not need a large number of frames to produce a smooth sharp result. In the example below I've only chosen the best 200 frames of a 2000 frame video to stack, but as the video was pretty noise free to begin with we can be very selective. With planetary imaging you would typically stack a much larger number of frames as your video will have been shot with a much higher gain and will be correspondingly much noisier.

Autostakkert stack settings

Autostakkert stack settings

Now you need to sharpen your image, and for this you are going to use Registax 6. Open the stacked TIFF produced by AutoStakkert in Registax and you will be presented with the screen below. The wavelet sliders on the left hand side are what you will use to sharpen your image, with sliders 1 - 6 acting on progressively finer detail. If you have good quality data you will probably be able to get away with only adjusting slider one, as shown below. If you are too aggressive with the sharpening it will be very apparent, so less is more here; don't go for a super sharp image as this will look false and artifacts will be introduced.

The degree of sharpening and noise reduction can also be controlled by adjusting the values in the boxes above each slider, so it's worth experimenting with these as well. If you are feeling brave, have a go at checking the 'use linked wavelets' box; for this you will definitely need to make use of the 'Denoise' boxes. The best thing is to experiment with the sliders as every image is different, but remember: don't over sharpen.

Once you are happy with the result click 'Do All' and then 'Save'.

Registax 6 - sharpened image

Registax 6 - sharpened image

AutoStakkert - for stacking: http://www.autostakkert.com/

Registax for sharpening: https://www.astronomie.be/registax/

Final Steps

You're almost there, just some final tweaks to perform and you have your image. For final processing I use PhotoShop CC but you can also use free software such as GIMP for these actions as there is little you need to do to the image you have already processed via the steps described previously.

Really there are only two things you need to do: crop your image and adjust the levels. Cropping is needed as stacking generally creates a ragged edge on your processed image, an artifact of the alignment process, so at the very least crop away the edges of the image to clean it up. For lunar images I also tend to lighten the mid tones via a levels layer. You can also change the aspect of your image by rotating it, and create mosaics (remember to mosaic after cropping but before adjusting levels etc.). You shouldn't need to perform any further sharpening or noise reduction as this will have already been done. That's it, you can now sit back and admire your handiwork!

Image cropped in Photoshop and levels adjusted

Image cropped in Photoshop and levels adjusted

Astronomical Close Encounters - A Matter of Perspective

There are a few astronomical events that are sure-fire crowd pleasers, guaranteed to catch the attention of astronomers and the general public alike. Who can forget the Chelyabinsk meteor, or not be blown away by a beautiful display of aurora? The problem is that it's hard to predict when and where the next sizeable lump of space rock is going to burn up in our atmosphere, and aurorae, though more predictable, are in general confined to latitudes more northerly than those of the majority of the UK - certainly for the spectacular displays.

Here I’m going to concentrate on equally spectacular events that are totally predictable and therefore accessible to anyone who has patience or a willingness to travel - eclipses, transits, occultations and conjunctions. Before continuing some simple definitions might prove useful:

  • A conjunction occurs when two astronomical bodies (two solar system objects, or one solar system object and a star) have the same right ascension when observed from Earth; this basically means that they appear close to each other in the sky. The key point here is that they appear close to one another; this is, as the title of this post suggests, merely a matter of perspective. Of course, very rarely you can get conjunctions where this isn't the case: comet C/2013 A1 Siding Spring and Mars not only appeared close in the telescope eyepiece in the second half of October 2014 but were also physically very close to one another (astronomically speaking), scarily so for any Martians out there!
  • If the bodies have the same declination at the time of conjunction then the one that is closer to Earth will pass in front of the other and syzygy takes place. If a smaller body passes behind an apparently larger one this is an occultation; where the smaller body passes in front of the larger one this is a transit; and when the bodies involved are the Sun and the Moon this is an eclipse - simple. Of course, we can now also detect transits outside our solar system, the transit method being an incredibly useful tool for the discovery of exoplanets, but let's confine ourselves to events closer to home for now.

Eclipses

No other astronomical event can rival a total solar eclipse for pure spectacle. It is a wonderful coincidence that the Moon is 400 times smaller than the Sun but also roughly 400 times closer to the Earth (both distances vary, hence the annular eclipses I'll cover later). When the alignment is favourable, the new Moon passes directly in front of the Sun from the point of view of an observer on the Earth's surface, completely covering the Sun's disc; the result is a total eclipse, revealing the beautiful corona that makes up the Sun's outer atmosphere.
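
You can check that happy coincidence with a couple of lines of arithmetic; the figures below are mean values, which is why the match is close but not exact, and why eclipses can be total or annular depending on where the Moon is in its orbit.

    # The size/distance coincidence behind total solar eclipses (mean values).
    import math

    sun_diameter_km, sun_distance_km = 1_392_000, 149_600_000
    moon_diameter_km, moon_distance_km = 3_474, 384_400

    print(f"Sun:  ~{math.degrees(sun_diameter_km / sun_distance_km):.2f} degrees across")
    print(f"Moon: ~{math.degrees(moon_diameter_km / moon_distance_km):.2f} degrees across")
    print(f"Size ratio {sun_diameter_km / moon_diameter_km:.0f}, "
          f"distance ratio {sun_distance_km / moon_distance_km:.0f}")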

Image Credit: Rice Space Institute

Image Credit: Rice Space Institute

I can remember travelling to Cornwall in 1999 to witness a rare total eclipse visible from mainland UK, something I'd been looking forward to for years. Unfortunately, as everyone in the UK knows, totality was clouded out apart from for a few lucky souls who managed to find themselves under a brief gap in the clouds. Despite the huge disappointment, even the cloud-diluted experience was unforgettable; sitting on a beach watching the sky darken and the birds go to roost was totally surreal and an event I'll remember for the rest of my life. The next total eclipse visible from the UK is in 2090, one I will certainly miss but maybe one for my kids. If you can't wait that long, and didn't make it to the States to witness the eclipse in August 2017, then make your way to Spain in August 2026, or witness a large partial from here in the UK. If you want to find out when the next eclipses are visible from your location, or worldwide, then www.timeanddate.com is a good resource.

The closest I've come to witnessing a solar eclipse!

The closest I've come to witnessing a solar eclipse!

Of course the Sun and the Moon don't just provide total eclipses for us to enjoy. When the Moon is at a point in its orbit where its apparent size is not large enough for it to cover the entire face of the Sun (near apogee), an annular eclipse occurs in which the outer edge of the Sun is still visible, creating the famous 'ring of fire'. It's a long wait for the next one of these visible from the UK, but we can console ourselves with partial eclipses, and here the odds are much more favourable. I have seen a few from the UK and listed those up to 2028 below.

Partial solar eclipses visible from the UK 2018 - 2028: 10th June 2021, 25th October 2022, 29th March 2025, 12th August 2026, 2nd August 2027, 26th January 2028

Of course the Moon also experiences eclipses, where the Earth passes in front of the Sun from the point of view of anyone fortunate enough to be standing on the lunar surface. These only occur on the night of a full Moon and, unlike solar eclipses, are visible from anywhere in darkness with the Moon above the horizon while the eclipse is taking place. Also, as the Earth casts a much larger shadow than the Moon, lunar eclipses last for hours rather than the few minutes of totality experienced on Earth during a solar eclipse.

Lunar eclipses come in three forms: total, penumbral and partial. Total eclipses occur when the Moon is completely within the darkest (central) portion of the Earth's shadow - the umbra. A penumbral eclipse occurs when the Moon passes through the outer region of the Earth's shadow - the penumbra. This only partially blocks the Sun, resulting in the portion of the Moon in shadow becoming slightly darker; the effect can be quite subtle. A partial lunar eclipse occurs when part of the Moon passes into the Earth's umbra.

Image Credit: Rice Space Institute

Image Credit: Rice Space Institute

At the point where the Moon is completely within the Earth’s umbra it takes on a reddish appearance due to the scattering of light by the Earth’s atmosphere, the same process that gives us our beautiful red sunrises and sunsets. Interestingly the more dust in the atmosphere the redder the Moon will appear. The last total lunar eclipse visible from the UK was in September 2015 when the sky was amazingly clear over London.

The Moon and the stars, a few moments before totality

The Moon and the stars, a few moments before totality

Lunar eclipse composite

Lunar eclipse composite

Lunar eclipses visible from the UK over the next 10 years are listed below.

  • Total Lunar - 27/28th July 2018, 16th May 2022, 7th September 2025, 31st December 2028
  • Penumbral - 10th January 2020, 5th June 2020, 5th July 2020, 25th March 2024, 20/21st February 2027, 6th July 2028
  • Partial Lunar - 16/17th July 2019, 19th November 2021, 28th October 2023, 18th September 2024, 14th March 2025, 28th August 2026, 12th January 2028

Transits

Whenever transits are considered it is usually the transit of Venus that immediately springs to mind. Transits of Venus loom large in scientific history, and in terms of pure adventure surely nothing can compete with the first voyage of Captain James Cook, whose aims were to observe the 1769 transit of Venus from Tahiti, thus enabling an accurate measurement of the distance from the Earth to the Sun, and, perhaps as an afterthought, to confirm the existence of the land mass we now call Australia.

Venus resembles a perfectly formed roaming sunspot as it transits, and the fact that you are seeing a planet describing its orbit played out before you in real time makes this an incredible event to witness. Transits of Venus are also precious for their rarity. Consider the transit of 1631: if you missed that, you only had 8 years to wait until the next one in 1639. Miss that, however, and it would be a 122 year wait until the next 8 year pairing in 1761/1769. The odds then become more favourable if you missed these, as there was 'only' a 105 year wait until the next in 1874, then another 122 year gap after the 1882 transit, and so on. I'm sure many of you witnessed the transits in 2004 and 2012; unfortunately I didn't (clouds) and fear I won't be around for the next in 2117, regardless of the fact that it is one of the shorter gaps in the cycle!

Not to be disheartened, there is also the support act of transits of Mercury. Here the odds are much better, owing to the fact that Mercury orbits much closer to the Sun, giving it a much shorter orbital period; this means everything is played out in fast forward. There are roughly 13 to 14 transits of Mercury per century, so there is always a good chance of seeing at least one in your lifetime. These transits occur in May or November, with the last one occurring on May 9th 2016; luckily the skies cleared for a few hours, enabling me to capture the images below. The next three transits will occur in November 2019, 2032 and 2039, before the next May transit in 2049.

Mercury and AR2542

Mercury and AR2542

Mercury transiting the Sun - white light, false colour

Mercury transiting the Sun - white light, false colour

If you are impatient and transits of Mercury are still too rare for you, then Jupiter comes to the rescue. The four Galilean moons, Io, Europa, Ganymede and Callisto, regularly transit across the face of the planet; in fact you can also witness occultations and even eclipses where one moon passes in front of another. There are a number of web resources that are incredibly useful in planning when best to view Jovian transits, such as www.calsky.com, planetarium software and the astronomy press. These events make a great imaging opportunity, where the moon, or moons, cast shadows on Jupiter's cloud tops. You can also make a time-lapse showing the Galilean moons dance around the gas giant.

Transit of Europa across the face of Jupiter

Transit of Europa across the face of Jupiter

We’re not finished there however as with luck you can also witness the International Space Station (ISS) transiting the Sun or Moon. I’ve found two web resources invaluable for determining when a transit is going to be observable from my location www.calsky.com and www.heavens-above.com. I’ve covered this in detail in a separate blog post, available here http://www.thelondonastronomer.com/it-is-rocket-science/2017/6/26/imaging-and-processing-solar-and-lunar-transits-of-the-international-space-station

ISS passing in front of the Sun

ISS passing in front of the Sun

ISS Solar transit composite

ISS Solar transit composite

Occultations

If you want to observe an occultation then it’s best to aim your gaze towards Jupiter and its moons. As mentioned earlier an occultation of one of the main Galilean satellites by Jupiter is fairly common and easily observable with the equipment available to most amateur astronomers. You can plan your observing using the resources mentioned previously.

Putting Jupiter to one side, when most people think of an occultation it is the Moon that plays the part of the larger body, with a planet, minor planet or star being occulted. For a star to be occulted it needs to be close to the ecliptic, with the bright stars Regulus, Spica, Antares and Aldebaran being in positions where the Moon may pass in front of them. The Moon may also occult The Beehive Cluster (M44) and The Pleiades (M45).

Aldebaran occultation composite

Aldebaran occultation composite

An occultation of a planet, however, is what is really required to be considered a 'crowd pleaser'. These occur surprisingly often from somewhere on the Earth's surface, as both the Moon and the planets inhabit the area around the ecliptic and so come into apparent contact with one another on a regular basis. A great resource for occultation data is the USNO On-line Astronomical Almanac (http://asa.usno.navy.mil/SecA/olist18.html), well worth checking out to see if there is a notable event visible from your location; you can also keep an eye on the astronomy press.

Conjunctions

Finally we have conjunctions. Although you might consider them the poor relation of the occultation, a celestial near miss, they can be every bit as spectacular as any of the other events I've described, and when a conjunction is between two bright planets it can stir up a fair bit of media interest.

Mars, Jupiter conjunction

Mars, Jupiter conjunction

We also have the situation where one person's occultation is another's conjunction. An occultation of Saturn by the Moon, for example, will only be visible from specific points on the Earth's surface, but anyone near those points will have seen a conjunction of varying proximity depending on their location. In the UK we've had some beautiful conjunctions to enjoy where the Moon has made a close pass of Venus, Mars, Saturn and even distant Uranus. As always with these events, use the resources out there to make sure you don't miss a great opportunity to witness or image a conjunction for yourself.

Moon and Venus conjunction

All the events I’ve described, while no longer offering profound scientific insight, still have the power to fuel the imagination and trigger a lifelong interest in the heavens. So get planning, and make sure to persuade your friends and family to take the time to look up and enjoy the spectacle for themselves; you never know, you might make a convert.

Astronomy Under City Skies or Combatting The Curse of Light Pollution

The question I get asked more than anything else when the subject turns to astronomy is “How can you possibly do astronomy or astrophotography from London when there is so much light pollution?” Here I’ll attempt to answer that question as well as hopefully inspire others who live in or near a city (which is most of us) to give the wonderful hobby of astronomy a go.

As you can see from the map below London is one of the most challenging places in Europe from which to view and image the night sky, so if I can do it anyone can!

Image Credit - SCIENCE ADVANCES http://advances.sciencemag.org/content/2/6/e1600377

Make the most of your location

Whether because of work commitments, family ties or just financial reality most of us don’t have the option of moving to a dark sky site in order to pursue astronomy or astrophotography as a hobby, so what can you do to make the most of your light polluted urban location? Here are a few tips that might help, some are pretty obvious, but I think worth stating anyway.

  • Make sure you are in shadow. Streetlights and neighbours’ bright lights are a universal problem in cities and towns. I’m constantly plagued by a kitchen light in a neighbouring apartment that is so bright it illuminates the whole of my garden; I swear they must have to wear sunglasses to cook! I’ve planted some trees to obscure it, but it will be a few years before they are totally effective, so the best thing to do in these situations is to set up in the shadow of a wall, tree or large bush, taking care that you can still see your intended target. If you can’t find that perfect spot then you could try channelling a Victorian photographer and cover your head with a dark cloth or coat. This will enable your eyes to become dark adapted (as long as it doesn’t slip off) and will help you tease out fainter details at the eyepiece.
  • Shade your optics. If you can’t get yourself into shadow, or don’t fancy putting a hood over your head, try to shield your optics from stray light. Dew shields are a good option; you can either buy one or make one yourself. They serve the dual purpose of cutting out light and increasing the length of time your scope can sit outside before water condenses on your optics, extending the time you have for observation or photography.
  • Image/observe when conditions are at their best. Pollution, dust and water vapour in the atmosphere all make light pollution worse, as more light is reflected back from all those streetlights. Poor conditions generally occur when humidity is high or when it has been dry for a long period, resulting in increased dust in the air; when you hear the term ‘transparency’, this is usually what is being referred to, and it is one of the reasons why observatories tend to be located in dry, lofty locations. For astronomy we also need stable conditions with low wind speeds, both at ground level and at the level of the jet stream. Astronomers call the variation in atmospheric stability ‘seeing’: the better the seeing, the better the view you will have. A simple way of judging this is to gauge how much the stars are twinkling; shimmering stars may look pretty, but they signify poor conditions for astronomy as they are the result of a turbulent atmosphere. Seeing is especially important for high resolution photography, regardless of your location. I’ve included links to some useful websites at the end of this post which will help give you an idea of when conditions out under the stars will be at their best.
  • Plan your observing in advance. Planning is key: from any location you will get the best views of a target when it is highest in the sky, and this is even more important when observing or imaging from a city or town. By waiting until your target is at its highest you will minimise the amount of atmosphere you are looking through and lessen the impact of light pollution. There is also no point trying to view dim objects under a full Moon, so make sure you know your Moon phases and plan accordingly. When the Moon is big and bright, image and observe it rather than trying to battle against it.
  • Choose targets least affected by light pollution. A simple rule is that the brighter the object, the less affected it will be by light pollution. Light pollution has zero impact on lunar and solar observation, and planetary observing and imaging is also pretty much unaffected; I’ve been able to view and photograph Neptune from my Wimbledon back garden. When observing and imaging planets, atmospheric conditions are the limiting factor, not stray light. Globular and open clusters are also a great option when observing under compromised skies, but visual observation of galaxies and nebulae will be challenging. Having said that, big bright nebulae like M42, the Orion Nebula, will still be visible under light polluted skies, as will M31, the Andromeda Galaxy.
  • Foster neighbourly relations. Security lights are great generators of unwanted light - it is after all their prime purpose! If you have them then make sure they are switched off when you are observing, if your neighbours have them get them over and introduce them to the wonders of the universe….then ask them if they could turn off their lights while you are indulging in your nocturnal hobby!
  • Become a night owl. As the night wears on the amount of stray light decreases as people go to bed and switch off lights, also some local authorities switch off street lighting after midnight to save cash. The later you can do your observing the better.
  • Speak to your local authority. If you have a street light that shines into your garden it may be worthwhile bringing this up with your local authority; if it shines into a bedroom, even better, as this will help you build a case for them to install shielding to reduce the amount of light falling on your property. A polite enquiry can work wonders sometimes.
  • Escape to the country. I know these are supposed to be tips about making the most of your location but sometimes it’s nice to pack up and go somewhere truly dark – just to re-charge your astro batteries and remind yourself of what a truly dark sky can deliver.

Fight Light Pollution with Filters

Of course as well as the ‘no cost’ options described above you can also throw science and money at the problem by employing filters. Filters specifically designed to reduce the effect of light pollution go by a number of different names: City Light Suppression (CLS), Anti-Light Pollution (ALP), Light Pollution Suppression (LPS), Ultra High Contrast (UHC) and so on. They all work by tuning out the wavelengths emitted by common street lights and can make a big difference to both the eyepiece view and that obtained via imaging. There is, however, some bad news on the horizon: the move to LED lighting by many local authorities will make light pollution filters less effective, as LEDs emit over a much broader spectrum.

My Canon 6D with Astronomik CLS filter fitted

It is safe to say that there is a fair bit of debate about the effectiveness of these types of filter for visual astronomy, but in my experience CLS filters are very effective when imaging from my London location, making wide-field, long exposure DSLR shots viable. Have a look at the images below. Both were taken with the same exposure and white balance settings, the only difference being that the image on the right was taken with an Astronomik CLS filter in place. I haven’t done any processing on either image so that a direct comparison can be made. The difference in the sky background is obvious: the sky in the unfiltered image on the left is getting close to swamping out any detail, while the filtered image on the right would benefit from a longer exposure. Both images could be improved significantly with processing, not least corrections to colour balance.

Both images were taken with a Canon 6D, 20 second exposure at ISO1600. The image on the left was taken with no filter and white balance set to daylight. The image on the right was taken with the same settings but with a CLS filter in place. The bright blob is globular cluster M2

A few points regarding light pollution filters are (I think) uncontroversial and worth listing.

  • Filters work by subtracting (filtering out) particular wavelengths of light. This will make objects dimmer but will, if the correct filter is used, improve contrast, which makes faint nebulae easier to see. By blocking the wavelengths emitted by certain common artificial lights while allowing through the emission lines of the nebula you are viewing, the sky will appear darker and the nebula brighter, as contrast is improved. This also has the advantage of allowing longer exposure times for astrophotography, as the sky background will take much longer to swamp the image of the object you are photographing.
  • The effectiveness of light pollution filters varies depending on the object you are viewing. If you are viewing a source that emits across a broad range of wavelengths, such as a galaxy or reflection nebula, the filter will not be as effective as when viewing an object that emits at narrow wavelengths, such as an emission nebula radiating at H-Alpha or Oiii. An urban location is definitely much more of a barrier to achieving good results on galaxies and reflection nebulae; it may be that only a huge number of stacked, unfiltered ‘short’ exposures will give you the result you crave!
  • Light pollution filters will not block out all artificial light sources. They are not a substitute for truly dark skies.
  • Filters will increase the time required to image an object. This may not be strictly true in all cases but in my experience it is certainly desirable to increase image times if possible.

It is also worth noting that when imaging using light pollution suppression filters some colour shift will occur, some filters producing a more noticeable cast than others. This can however be easily remedied by creating a custom white balance for your camera with the filter in place. As with most things you get what you pay for, the limitations described above are less marked the higher up the price scale you go. Ultimately these filters let you make the most of your location, at least until you can afford that mountain top bolt hole.

Go Narrowband

Why not throw even more money and science at the problem? A great solution for the astrophotographer is to image using narrowband filters, the most common being those that transmit the emission lines of H-Alpha, Oiii and Sii. With these three filters you can try to re-create the famous ‘Hubble palette’, utilised by the Hubble Space Telescope for many of its amazing images.

These filters work really well on emission nebulae, enabling you to image even very faint targets from an urban environment. They work by capturing very specific wavelengths of light, typically those dominant in star forming regions (Orion Nebula, Lagoon Nebula), in the shells of gas shed by dying stars that form planetary nebulae (Dumbbell Nebula, Ring Nebula), and in supernova remnants (Veil Nebula, Crab Nebula). The bad news is that if you want to image the beautiful blue nebulosity around the Pleiades then narrowband filters do not work well; this nebulosity is lit by reflection from the nearby stars and so is a broadband source.

Here’s a brief summary of the emission lines passed by the most common filters:

Hydrogen-Alpha, H-Alpha (656.3nm) – This is most dominant in star forming regions such as the Orion Nebula. It is in the red part of the spectrum, which is why in standard RGB colour images these types of nebulae appear red.

Oxygen-III, Oiii (500.7nm) – This emission line is in the blue-green part of the spectrum and is dominant in planetary nebulae; the Dumbbell Nebula shows up incredibly well when this type of filter is used.

Sulphur-II, Sii (672.4nm) – This emission line is well into the red part of the spectrum; it is generally weak and requires a lot of exposure time to generate a decent result. In bi-colour imaging this filter is typically left out, but you will need it if you want to faithfully re-create the Hubble palette.
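If you are curious how those three channels come together, the standard Hubble-palette (SHO) mapping assigns Sii to red, H-Alpha to green and Oiii to blue. Below is a minimal Python sketch of that channel combination, assuming three aligned and stretched mono frames saved as 16-bit TIFFs; the file names are placeholders and real processing would involve far more careful scaling.

```python
# Minimal SHO (Hubble palette) channel combination sketch.
# Assumes aligned, stretched mono frames; file names are illustrative.
import numpy as np
from PIL import Image

def load(path):
    return np.asarray(Image.open(path), dtype=np.float64)

sii, ha, oiii = load("sii.tif"), load("ha.tif"), load("oiii.tif")

rgb = np.dstack([sii, ha, oiii])                  # Sii -> R, H-Alpha -> G, Oiii -> B
rgb = (255 * rgb / rgb.max()).astype(np.uint8)    # crude scaling, preview only

Image.fromarray(rgb).save("sho_preview.png")
```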

The positions of these emission lines are shown in the graph below.

Image Credit Starizona.com

The Horsehead Nebula, imaged from London using an H-Alpha filter

Embrace the Power of Stacking

If you want to take pictures under light polluted skies but can’t afford, or don’t want to use, filters, there is another solution: stacking.

The key drawback to imaging under light pollution is that longer exposures cause the sky background to wash out or overwhelm the dimmer, more subtle objects and features that would be visible under less polluted skies. The solution is to keep your exposure times within the limit dictated by your sky brightness; you can work out what this is by taking a number of exposures of varying length and checking the image histogram, or by visual assessment. When using the histogram you basically don’t want the peak of the graph to be over to the right, and you definitely don’t want it touching the right-hand edge, as this indicates overexposure and total loss of data for that part of the image.
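If you would rather not judge the histogram purely by eye, the same check can be scripted. The Python sketch below assumes a single test frame converted to a 16-bit TIFF; the file name is a placeholder and the thresholds are rough rules of thumb, not hard limits.

```python
# Check whether a test exposure is clipping or the sky background is too high.
# Assumes a 16-bit TIFF test frame; file name is illustrative.
import numpy as np
from PIL import Image

frame = np.asarray(Image.open("test_20s_iso1600.tif"), dtype=np.float64)
full_scale = 65535.0

clipped = np.mean(frame >= full_scale)     # fraction of saturated pixels
sky_level = np.median(frame) / full_scale  # sky background sits near the median

print(f"Saturated pixels: {clipped:.2%}")
print(f"Sky background at {sky_level:.0%} of full scale")

# Rough guide: if more than a fraction of a percent of pixels are clipped, or
# the sky background is pushing past about a third of full scale, shorten the
# exposure (or drop the ISO) and shoot another test frame.
```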

Once you’ve established how long an exposure you can get away with, take lots of short exposures that you can then stack (you’ll need a tracking mount and an intervalometer). This increases the signal-to-noise ratio and will enable you to tease out faint detail in post processing, even under light polluted skies.
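The principle is simple enough to sketch: averaging N aligned frames improves the signal-to-noise ratio by roughly the square root of N. The Python below assumes mono short exposures that have already been aligned and saved as 16-bit TIFFs; the paths are placeholders, and in practice dedicated stacking software such as Deep Sky Stacker will do a far more sophisticated job.

```python
# Mean-stack a folder of aligned mono frames; SNR improves roughly as sqrt(N).
# Paths are illustrative; frames are assumed to be registered already.
import glob
import numpy as np
from PIL import Image

files = sorted(glob.glob("aligned/*.tif"))
stack = np.mean(
    [np.asarray(Image.open(f), dtype=np.float64) for f in files], axis=0
)

# Save as 16-bit TIFF for stretching in your processing software of choice.
Image.fromarray(np.clip(stack, 0, 65535).astype(np.uint16)).save("stacked.tif")
print(f"Stacked {len(files)} frames; expected SNR gain ~ {len(files) ** 0.5:.1f}x")
```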

The subject of image stacking and processing would require a book to cover properly and as this post isn’t about image processing I’ll leave the details of theory and work-flow to another time. If you do want to investigate this further there are plenty of resources out there on the web along with free software such as Deep Sky Stacker (DSS) which will do a lot of the hard work for you.

Give up and Go Robotic

All is not lost if you still want to capture and process your own images of the night sky but don’t have the heart for the fight against light pollution. Robotic imaging offers you the chance to use amazing telescopes sited under some of the best skies on the planet, all from the comfort of your home. I wrote an article about it a few years ago which can be found here (it says written by Ralph, but it was actually me!).

https://www.awesomeastronomy.com/articles/149-dawn-of-the-robots-or-how-robotic-imaging-saved-my-sanity

Useful Websites and Apps

Weather and seeing

http://www.metcheck.com/HOBBIES/astronomy_forecast.asp?LocationID=3867

http://weather.unisys.com/gfs/gfs.php?inv=0&plot=pres&region=us&t=4p&expanddiv=hide_bar

Clear Outside – a great app from First Light Optics

Planning your targets

There are loads of astronomy applications out there, but if you want something to get started with I would recommend Stellarium. I’ve found that for planning it is best to use something on your desktop or laptop rather than a phone or tablet; having said that, there are lots of great apps available for your smartphone. I’ve found ‘Starwalk’ and ‘Moon Globe’ particularly useful.

There is quite a comprehensive list at Astronomy Online which is worth a scroll through http://astronomyonline.org/AstronomySoftware.asp

 

Imaging and Processing Solar and Lunar Transits of the International Space Station

Have you ever watched the International Space Station pass overhead on a clear night? If you haven’t you should, but this blog isn’t about witnessing the ISS as it zips across the sky, this is about rarer and more spectacular events.

Occasionally your location on the Earth, the orbit of the ISS and the position of the Sun, Moon, a star or planet conspire to align perfectly, and when that happens you get an occultation (where the ISS completely blocks out the light from a star or planet) or a transit (where the ISS passes directly between you and the Sun or Moon). Here I’ll try to share a few of the things I’ve learnt from successfully, and unsuccessfully, imaging these events over the years.

Planning your shot

Obviously you can’t just randomly choose a time to image the Sun or Moon and hope that the ISS passes across your field of view; you need to plan, and luckily help is at hand in the shape of the rather excellent website Calsky https://www.calsky.com. It is a fantastic resource, which I find invaluable for planning observations and imaging, and it is, in my opinion, the best and most accurate tool for predicting ISS transits.

The width of the ground track from which a transit of the Moon or Sun is visible is only around 13 km. From the middle of that band the ISS transits across the centre of the Sun or Moon, while at the limits of the band it appears to graze the edge of the object. This makes it very important that Calsky knows exactly where you are on the surface of the Earth, especially for the prediction of occultations, where the track of visibility may be measured in metres.
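A rough back-of-envelope calculation shows why the band is so narrow: the corridor, measured perpendicular to the line of sight, is roughly the slant range to the ISS multiplied by the Sun’s angular diameter, and projection onto the ground at lower elevations stretches it further, which is how figures of around 13 km arise for lower passes. The Python sketch below is an order-of-magnitude estimate under simple flat-Earth assumptions, not a substitute for Calsky’s predictions.

```python
# Order-of-magnitude estimate of the transit visibility band width.
# Simplified flat-Earth geometry; values are illustrative only.
import math

iss_altitude_km = 420.0                     # typical ISS altitude
sun_diameter_rad = math.radians(0.53)       # angular diameter of the Sun

for elevation_deg in (90, 60, 30):
    slant_range_km = iss_altitude_km / math.sin(math.radians(elevation_deg))
    band_km = slant_range_km * sun_diameter_rad   # perpendicular to line of sight
    print(f"{elevation_deg:2d} deg elevation: band of order {band_km:.1f} km "
          "(wider still once projected onto the ground)")
```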

Calsky Location information

Your location is shown at the top right of the Calsky main page. To change it click on the location it has chosen for you (you can also save locations etc., but we’ll do it from scratch).

Setting your location in Calsky

Enter the postcode, zip or address of your chosen observing location and Calsky will bring up a map where you can further refine your positioning. Just click where it indicates to set this as the location from which it will base its calculations.

Now navigate to the Satellites/International Space Station ISS pages of the Calsky website.

Choosing your calculation period in Calsky

Select the period you want the calculation to cover; the start time defaults to the current date and time, and I like to select two months for the duration so I can see any upcoming events. It should be noted that the further into the future the prediction is, the greater the uncertainty. A transit may be predicted but not happen, as the ISS has to adjust its orbit every so often; conversely, a transit may happen where previously none was predicted. My advice would be to check the site regularly, and always check the day before and on the day of the transit you are planning to image, so that you have the most up to date and accurate information to hand.

Narrowing the calculation to transits and close passes in Calsky

To see only predictions for transits and close passes, un-check the Show satellite passes box, make sure the Close fly-bys box is checked, and set the minimum angular separation. You will then get a series of predictions for when, from your location, the ISS will satisfy the parameters you entered.

Your returned calculation - yay a transit!

Here a transit of the Sun (which I managed to image) is predicted to occur at 16:52 and 32 seconds on the 18th June 2017. This is what you need to put in your diary!

Capturing the moment

I’ll assume you are using a planetary camera to image your chosen transit and capturing an .AVI or .SER video file. You can shoot using video mode on a DSLR, or even shoot high frequency single exposures, but I’ve no experience of this so I’ll leave that for others to explain. I use SharpCap for planetary camera control, but what I’ll cover here is equally relevant for FireCapture or any other camera control software.

The benefit of an ISS transit over imaging the ISS as it flies overhead is that you can set up, focus and sort out your exposure well in advance. Let’s face it, it’s pretty easy to find the Sun or the Moon, and if you have a tracking mount you can be sure that the spot in the sky where the ISS is going to pass will stay in your field of view.

If your set-up doesn’t allow you to capture the full face of the Sun or Moon, or you want the ISS to appear at a larger image scale than this would provide, then it really does become important to know exactly where the ISS is going to pass in front of your chosen object; nothing is more frustrating than having the camera rolling and seeing the tip of a solar panel pass at the extreme edge of your field of view!

Luckily Calsky comes to the rescue here as well, providing a visualisation of the path of the ISS across the face of the Sun or Moon. Just make sure you are capturing at least some of the area indicated, with some wriggle room either side, and you should be fine. Also make sure you know whether your scope inverts the image, and in which plane.

Track of the ISS across the face of the Sun

For the Sun you need to set your exposure as you would for normal solar imaging; the ISS is going to appear in silhouette, so you don’t need to consider its relative brightness – it’s always going to be darker than the Sun! For lunar transits you may need to consider how bright the ISS will be relative to the part of the lunar surface it will be passing over. Check its brightness when transiting on Calsky and keep your exposure short if you want to capture detail on the ISS and not over-expose. If you expose as you would for imaging Venus then you won’t be too far off, but err on the side of caution – you aren’t going to get a second chance, at least not without a bit of a wait.

Try to get as high a frame rate from your camera as you can by adjusting gain etc.; the ISS moves very quickly, especially when overhead, so a slow frame rate will restrict how many frames containing the ISS you end up with in your capture video.
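To get a feel for the numbers: assuming the ISS motion is roughly perpendicular to the line of sight, its angular rate is approximately its orbital speed divided by the slant range, and the crossing time is the Sun’s angular diameter divided by that rate. The Python sketch below uses illustrative values only.

```python
# Estimate how many frames will contain the ISS during a solar transit.
# Illustrative values; real passes vary with geometry.
import math

iss_speed_km_s = 7.66          # approximate orbital speed
slant_range_km = 550.0         # a fairly high pass
sun_diameter_deg = 0.53

angular_rate_deg_s = math.degrees(iss_speed_km_s / slant_range_km)
crossing_time_s = sun_diameter_deg / angular_rate_deg_s
print(f"Crossing time: about {crossing_time_s:.2f} s")

for fps in (30, 60, 120):
    print(f"{fps:3d} fps -> roughly {crossing_time_s * fps:.0f} frames on the disk")
```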

When it comes to recording the capture video I generally start about a minute before the predicted transit time, though the Calsky predictions are accurate enough to reduce this if you want. A very important point is to make sure that you set a recording time limit that covers the period the transit is predicted for: don’t set a frame limit. If you can set your capture time limit to infinite, even better; you can always stop recording once the ISS has passed. I have learnt the importance of this point through bitter experience, watching the ISS occult the star Aldebaran and congratulating myself on having captured it, only to discover that the camera had stopped capturing after 3000 frames, just before the main event!

Processing the results

So now you have managed to capture the transit you need to be able to process the results to show it off to its best effect and let others share the magic. There are lots of different options when it comes to processing, but I've described here the workflow that works best for me. Feel free to use whatever software you feel most comfortable with. 

You will of course have watched your capture video back and confirmed that you do indeed have some frames where the ISS is seen moving in front of the Sun or Moon. You’ll also have noticed that the vast majority of the video contains no hint of the ISS (unless you are an adrenaline junkie who started recording just before the predicted transit time and stopped just after!). You can keep the capture video as is, but I’ve found that extracting the frames where the ISS features is made much easier by editing a copy of the video down to just the frames you want to stack as a composite ISS track.

There are lots of applications out there that will enable you to do this but I’ve found one of the easiest to use is a free application called VirtualDub. It can be downloaded from here http://virtualdub.org/download.html and allows you to perform basic editing on your video with minimal stress.

In the image below I have opened my ISS transit video and used the cropping buttons (highlighted) to indicate the frame range I want to keep in the copy output video. All you need to do is perform a File/Save As and it will save a cropped copy of the input video in your desired location.

The video processing screen in VirtualDub

The edited version of my capture video can be seen here - real time so blink and you'll miss it!

Once you have edited a version of your capture video down to the frames of interest you can then extract them as individual images. For this there are again a large number of options, but I’ve gone down the free software route and use Planetary Imaging PreProcessor (PIPP). This can be downloaded from here https://sites.google.com/site/astropipp/downloads

In the image below you can see the PIPP main screen with the edited ISS AVI loaded.

Load your video in PIPP

Go to the Output Options tab and choose an output format of .TIFF

Make sure you save your output as .TIFF

Once you have selected Do Processing the images will be saved to the folder of your choice as shown below

Your extracted single frames
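If you prefer a scripted route, the VirtualDub and PIPP steps can be approximated in a few lines of Python with OpenCV: pick the frame range containing the ISS and write each frame out as a TIFF. The file name and frame numbers below are purely illustrative.

```python
# Extract a chosen frame range from the capture AVI and save each frame as TIFF.
# An alternative sketch to the VirtualDub + PIPP workflow; names are placeholders.
import os
import cv2

os.makedirs("frames", exist_ok=True)
capture = cv2.VideoCapture("iss_transit.avi")

first, last = 1500, 1650                       # frame range containing the ISS
capture.set(cv2.CAP_PROP_POS_FRAMES, first)    # seek accuracy depends on the codec

for index in range(first, last + 1):
    ok, frame = capture.read()
    if not ok:
        break
    cv2.imwrite(f"frames/iss_{index:05d}.tiff", frame)

capture.release()
```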

Now the fun really begins. To create a composite of these images I use Photoshop Creative Cloud, taking advantage of the File/Scripts/Load Files into Stack menu option (pretty much all layer-based image processing software will let you perform the same function, but I’ll describe the Photoshop method here). You can also choose for the images to be aligned, which works reasonably well in most cases, though not all. In the screenshot below I’ve selected the individual frames extracted via PIPP.

Loading your images into a stack in Photoshop CC

Once loaded into a stack I set the blending mode on each image to Darken as highlighted below.

Blend the layers in Darken mode

Once you’ve performed this on each of your frames you should get the result below, which shows the track the ISS took as it zipped across the Sun (my camera dropped a few frames at the end of the transit, as you can see, so the track ends abruptly).

Your blended composite
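If you want to script this compositing step instead, Photoshop’s Darken blend simply keeps the darker of the two pixels at each position, so the same composite can be sketched in Python as a per-pixel minimum across all the extracted frames (the ISS silhouette is always darker than the solar disk behind it). The paths are placeholders and the frames are assumed to be already aligned.

```python
# Darken-blend composite: per-pixel minimum across all extracted frames.
# Paths are illustrative; frames are assumed to be the same size and aligned.
import glob
import numpy as np
from PIL import Image

frames = [np.asarray(Image.open(f)) for f in sorted(glob.glob("frames/*.tiff"))]
composite = np.minimum.reduce(frames)

Image.fromarray(composite).save("iss_track_composite.tiff")
```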

I then crop this around the region of interest, lighten the background, and paste it as a layer on top of my finished full disk mosaic (after the transit I made sure I shot additional frames to cover the whole face of the Sun). Use blend mode Darken again and, when happy with the result, merge your layers.

Now paste your cropped composite onto your full disk solar image

After a little finessing and playing with RGB levels to brighten and render the image in false colour this is what you end up with.

Lightened and blended image rendered in false colour

The final result

So now there’s nothing stopping you from trying this for yourself, apart from clouds…

Another Evening of Astrophotography at the Royal Observatory Greenwich

On the 24th November I'll be back at the Royal Observatory in Greenwich participating in their now annual 'Evening of Astrophotography'. The evening consists of a planetarium show narrated by a well known comedian; last year it was Jon Culshaw, who peppered his script with some brilliant impressions of Brian Cox, Patrick Moore and of course Tom Baker. This year the comedian Helen Keen does the honours; if you've ever listened to her BBC Radio 4 show 'It is Rocket Science' then you'll know we are in for a treat.

After the show there is a panel Q&A session, which I shall be taking part in along with Will Gater, Melanie Vandenbrouk and Jamen Percy. Then there are some workshops and an opportunity to re-fuel in the cafe and browse the galleries to see this year's winning entries in Astronomy Photographer of the Year 2016.

A great evening for anyone with an interest in astronomy/astrophotography in a historic location.

http://www.rmg.co.uk/see-do/exhibitions-events/evening-astrophotography

 

Lunar panoramas

Sometimes it's fun, when you have a high resolution lunar mosaic, to create a panoramic video; I love the impression it gives of floating above the lunar surface.

To create this one I used some free software called 'Instapan' to render the panorama as a video on my phone. Hope you like the result!