
Re: What's the effect of imaging through jet stream?

Worsel
 

CY

Jerry Lodriguss' take on the jet stream and seeing

http://www.astropix.com/html/i_astrop/Planetary_Imaging.html

Bryan


Re: Mach2 Unguided testing continues

Worsel
 

Don

The A-P ASCOM driver IS free with a mount.  Together with a planetarium program, e.g. Cartes du Ciel or Stellarium (both free), it lets you control the mount.  You don't need the hand control to do this either.

Bryan


Re: What's the effect of imaging through jet stream?

Cheng-Yang Tan
 

Thanks! Hopefully I’ll never see this problem again after I install the OAG. So, Rolando, the jet stream has never been a problem for you when you image?

On Saturday, May 30, 2020, 5:14 PM, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:

My first and only guess would be that the main scope is flexing with respect to the guidescope. Especially if the guidescope is mounted on the rings of the main scope. That's a guarantee of flexure and poor images. Jet stream - not an issue.

Rolando



-----Original Message-----
From: Cheng-Yang Tan via groups.io <cytan299@...>
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Sat, May 30, 2020 2:41 pm
Subject: Re: [ap-gto] What's the effect of imaging through jet stream?

Hi Rolando,
   I used a guide scope that is 4.59 arcsec/pixel (focal length 168 mm). Scope is a FSQ106. I plan to image again tonight. Again, the jet stream seems to be quite bad: 45 m/s. So if there's anything else to check, I'm open to suggestions.

cytan

P.S. I will be upgrading to OAG next week after my SBIG Starchaser SC-2 arrives. This is in preparation for an Adaptive Optics AO-8A in the future.


On Saturday, May 30, 2020, 02:32:25 PM CDT, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:


Are you using an off-axis guider or separate guide scope? What is the main scope you are imaging with?

Rolando





-----Original Message-----
From: Cheng-Yang Tan via groups.io <cytan299@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:53 pm
Subject: [ap-gto] What's the effect of imaging through jet stream?

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan
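
For anyone who wants to sanity-check these numbers, here is a quick back-of-envelope sketch in Python using the standard plate-scale formula (206.265 × pixel size in µm ÷ focal length in mm). The 3.75 µm guider pixel size is an assumption, since it isn't stated in the thread; the other figures are the ones quoted above.

def image_scale(pixel_size_um, focal_length_mm):
    """Plate scale in arcseconds per pixel: 206.265 * pixel size / focal length."""
    return 206.265 * pixel_size_um / focal_length_mm

# 168 mm guide scope with an assumed 3.75 um pixel: close to the quoted 4.59"/pixel.
print(image_scale(3.75, 168.0))   # ~4.6 arcsec/pixel

# The reported 0.5 arcsec RMS guide error expressed in imaging pixels
# at the main camera's 2.1 arcsec/pixel scale:
print(0.5 / 2.1)                  # ~0.24 pixel RMS, i.e. well sub-pixel

If those numbers are right, the guiding error alone would not explain eggy stars at the imaging scale, which fits the flexure and seeing suggestions later in the thread.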


Re: What's the effect of imaging through jet stream?

Roland Christen
 

My first and only guess would be that the main scope is flexing with respect to the guidescope. Especially if the guidescope is mounted on the rings of the main scope. That's a guarantee of flexure and poor images. Jet stream - not an issue.

Rolando



-----Original Message-----
From: Cheng-Yang Tan via groups.io <cytan299@...>
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Sat, May 30, 2020 2:41 pm
Subject: Re: [ap-gto] What's the effect of imaging through jet stream?

Hi Rolando,
   I used a guide scope that is 4.59 arcsec/pixel (focal length 168 mm). Scope is a FSQ106. I plan to image again tonight. Again, the jet stream seems to be quite bad: 45 m/s. So if there's anything else to check, I'm open to suggestions.

cytan

P.S. I will be upgrading to OAG next week after my SBIG Starchaser SC-2 arrives. This is in preparation for an Adaptive Optics AO-8A in the future.


On Saturday, May 30, 2020, 02:32:25 PM CDT, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:


Are you using an off-axis guider or separate guide scope? What is the main scope you are imaging with?

Rolando





-----Original Message-----
From: Cheng-Yang Tan via groups.io <cytan299@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:53 pm
Subject: [ap-gto] What's the effect of imaging through jet stream?

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan


Re: Mach2 Unguided testing continues

Ray Gralak
 

Hi Don,

I was a little surprised to have to shell out what I did for APCC after spending over eight large on the mount.
A-P could have increased the price of their mounts and included software with every mount. That's what, for example, Software Bisque does with their mounts. And you have to pay an annual subscription fee if you want to keep getting new features.

But A-P allows each person to decide if they want to purchase the software or not. The same thing with the keypad. You could have been "forced" to pay for the keypad if you wanted your 1100 but you got to save money by not having to. Doesn't this make sense?

One would think that software to run it would have been included.
I love the mount, but sorry to say, was disappointed in the control software.
And I am glad you love your mount! I love mine too!

But, APCC wasn't intended to be control software. In fact, one goal was that it be "transparent" to third party control software. APCC Standard adds safety features and new functionality. APCC Pro adds sophisticated pointing and tracking rate correction.

Third party applications don't even know, nor have to know, about APCC's safety features, or that APCC Pro is applying pointing and tracking rate correction. APCC just automatically works and is transparent to every software application using it. I think that people underestimate the value of this.
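
As a rough, hypothetical illustration of that transparency (this is not A-P sample code): a client script talks only to the ASCOM driver, and whether APCC is routing that traffic underneath, applying its limits and model, is invisible to the client. The ProgID below is the one normally registered by the A-P V2 driver, but confirm it in the ASCOM Chooser on your own system; Windows, the ASCOM Platform, and pywin32 are assumed.

import win32com.client

scope = win32com.client.Dispatch("AstroPhysicsV2.Telescope")  # verify this ProgID locally
scope.Connected = True        # the driver (and APCC, if configured) handles the mount link

print("RA:", scope.RightAscension, "Dec:", scope.Declination)
if scope.CanSlew:
    scope.SlewToCoordinates(14.05, 54.35)   # hours, degrees (roughly M101)

scope.Connected = False

The same script runs unchanged whether APCC is in the loop or not, which is the point Ray is making.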

That said, what do you feel should have been included in terms of control software with the mount?

-Ray Gralak
Author of APCC (Astro-Physics Command Center): https://www.astro-physics.com/apcc-pro
Author of PEMPro V3: https://www.ccdware.com
Author of Astro-Physics V2 ASCOM Driver: https://www.siriusimaging.com/apdriver

-----Original Message-----
From: main@ap-gto.groups.io [mailto:main@ap-gto.groups.io] On Behalf Of Donald Rudny
Sent: Saturday, May 30, 2020 1:19 PM
To: main@ap-gto.groups.io
Subject: Re: [ap-gto] Mach2 Unguided testing continues

Hi Ray,

I think any imaging program would work. From my understanding the images just need to be saved in a folder
that is watched by Atrack. It coordinates Pinpoint to plate solve the images and then adjusts the tracking rate
accordingly. I haven’t used it yet, but have been reviewing the user guide. I assume that any image file format
could be used similar to astrometry.net. I use Starlight Live, and it can save each exposure to a selected folder
as a fits file. I think that will work. Unfortunately it doesn’t solve my Mac issue. Starlight Live does have a
native Windows version, too, but I prefer Mac.

I have the APCC standard version. I do have a Windows 10 machine strictly used for Astro stuff. I use it to
initialize my mount and park it for my home observatory. After initializing, I connect to my iPad and use
SkySafari the rest of the night. My home observatory uses a P3 park position, so that’s why I use APCC. I
understand SkySafari was supposed to add other park positions, but I don’t think that happened yet. They only
use P4 which works well in the field. There I don’t even bring my PC with Windows.

I’ve been a Mac guy since 1984. I cringe every time I even hear the word Windows. I see all the issues
everyone has with it. The idea of having APCC on a Mac or iPad sounds appealing. I just hope I don’t have to
pay for it again. I was a little surprised to have to shell out what I did for APCC after spending over eight large
on the mount. One would think that software to run it would have been included. I love the mount, but sorry to
say, was disappointed in the control software.

Don

Don Rudny


On May 30, 2020, at 7:57 AM, Ray Gralak <groups3@gralak.com> wrote:

Hi Donald,

I’m not trying to say that any system is better than any other. I’m just trying to understand what is
available and what options users may have. Cost is important as well. I believe Atrack is free, but Pinpoint
runs about $150. They have a 60 day trial period. Some users out there with CP3’s might be interested in
this,
too.
Wouldn't you also need an imaging program, like MaximDL Pro or SkyXPro?

I did purchase APCC, but I find it lacking for the price. No
catalogue of objects. No Mac version. Not very intuitive. About the only thing I use it for is to initialize the
mount at my home observatory. Then I switch to my iPad and SkySafari for control.
I take it you have APCC Standard? If you are not using Windows, you are then missing out on some safety
and convenience features, like auto-park, and meridian and horizon limits.

So, what if there was an iPad version of APCC with a full planetarium view? One of the plans for the next full
version of APCC is to make it cross-platform. I've already started designing it and iOS is one of the possible
target platforms.

-Ray Gralak
Author of APCC (Astro-Physics Command Center): https://www.astro-physics.com/apcc-pro
Author of PEMPro V3: https://www.ccdware.com
Author of Astro-Physics V2 ASCOM Driver: https://www.siriusimaging.com/apdriver


-----Original Message-----
From: main@ap-gto.groups.io [mailto:main@ap-gto.groups.io] On Behalf Of Donald Rudny
Sent: Saturday, May 30, 2020 10:04 AM
To: main@ap-gto.groups.io
Subject: Re: [ap-gto] Mach2 Unguided testing continues

Rolando,

If you’re interested, here’s my mobile setup.




Don Rudny



On May 30, 2020, at 6:59 AM, Donald Rudny via groups.io <mkea13800=gmail.com@groups.io> wrote:




Hi Rolando,

Here is the link to the Pinpoint site.

http://pinpoint.dc3.com/

There is a section on accuracy that suggests that it is very accurate.

I’m not trying to say that any system is better than any other. I’m just trying to understand what is
available and what options users may have. Cost is important as well. I believe Atrack is free, but Pinpoint
runs about $150. They have a 60 day trial period. Some users out there with CP3’s might be interested in
this,
too.

I like your keypad method for what I do, which is usually a portable setup with my AP1100 and C11 Edge
at f/6 or f/2 with Hyperstar. Sometimes I will do f/10. It might not sound very portable, but I put everything
in
the back of my pickup and drive up to Maunakea to set up off the back of the truck. It takes about 20
minutes.
It gets pretty cold, so we sit in the pickup and view the object images on an extended monitor. I run the
cables
through the rear window. When the VIS was open, we did something similar and put on shows for the
visitors
with the equipment there. It was a big hit. I use a Mac, so I was able to download our captured images to
the
visitors iPhones. Some people stayed the whole night just to get all of them. That’s when I first
experienced
your excellent mounts and keypad. They had an AP1100 set up with an 11” RASA. I didn’t purchase a
keypad
with my AP1100 because of the expense. I decided to go with my iPad and SkySafari. I have Luminos too.
Both work very well and are very inexpensive. I did purchase APCC, but I find it lacking for the price. No
catalogue of objects. No Mac version. Not very intuitive. About the only thing I use it for is to initialize the
mount at my home observatory. Then I switch to my iPad and SkySafari for control.

I really don’t like to set up the auto guiding stuff, so your keypad system interests me. I think it would
work best for what I need, but I would need to center the star by eye on my Mac computer screen with
crosshairs. How accurate will that be? It will probably be more valuable for me running at f/6 or f/10 with
the
C11 than at f/2. At f/2, I can already get a couple of minutes unguided just with using the RAPAS for PA.
It’s
f/6 and especially f/10 that drift some.

I do have the CP4, but I would need to invest another $900 for a keypad, so you see my dilemma. I’m
willing to invest, but I need to know that it will really work for me.

Thanks,

Don


Don Rudny



On May 29, 2020, at 7:38 PM, uncarollo2 <chris1011@aol.com> via groups.io
<chris1011=aol.com@groups.io> wrote:






How accurate is centering a star by eye.

I actually do it automatically in MaximDL. I switch the main camera over to guide mode, let the
guider program pick whatever star it wants in the image and let it autoguide. At the end of 3 - 5 minutes I
press
the Enter button on the keypad and let the program advance to the next calibration point. I don't do the
centering manually, I let MaximDL do that. When I have gathered enough points I switch the camera back
over
to imaging and take my exposures.


I can see where Atrack can adjust the guide rates after each image, however that means that
there is drift in the image that is being measured. I'm not 100% convinced that plate solve can determine
the
position to an accuracy level of sub-arc seconds, so I did not pursue this method. Besides, if you're going to
use fancy software for this, then I would simply use APCC Pro and make an all-sky model, and be done with
it.
My program is for portable setups where the user has a mount, scope and keypad, maybe a laptop and
maybe
just a digital camera. No fancy software, just the basics to have some imaging fun.


Rolando



-----Original Message-----
From: Donald Rudny <mkea13800@gmail.com>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:12 am
Subject: Re: [ap-gto] Mach2 Unguided testing continues


Rolando,

I haven’t used Atrack, but have been reviewing the manual and I believe the program
continuously adjusts the drift rate as you capture images. They are saved in a file that is watched by the
program and plate solved through Pinpoint. I believe there is also a modeling routine that I assume can be
saved as long as the setup remains the same. As I say, I’m not 100% sure of this, but it looks like it’s worth
taking a look at.

One question I have on the AP keypad system is accuracy. How accurate is centering a star by
eye. I would think that a longer time between inputs would be necessary to improve accuracy. If I have a
fairly
decent PA, I would think I might need a 10-15 minute drift measurement. A longer focal length would also
help.

Don


Don Rudny



On May 29, 2020, at 6:15 PM, uncarollo2 <chris1011@aol.com> via groups.io
<chris1011=aol.com@groups.io> wrote:




Basically you get one data point in about 200 seconds (3 - 4 minutes) which sets the drift
rate at that point in the sky. So that will work for a while, maybe 1/2 hour to an hour, after which the drift will
have changed. So then you take another 3-4 minute run to establish a new drift rate. You do that every hour
or
so along the path that the object takes.


That's exactly what I'm doing also. 3 - 5 minute drift measurement which is good for an
hour. However, if I take just 3 drift measurements along the path, spread out over a 5 - 6 hour period, the
CP5
will then compute a continuously variable tracking rate for all points in between for the entire 6 hour period. I
can do this measurement all at once before the sun goes down using 3 widely spaced stars of Mag4 or
brighter
and an H-a filter. Takes about 15 minutes total. This path is then computed and ready to go when twilight
ends. I can even do another object path at a different Dec, download it and be ready to image two objects.
In
fact, if the two Dec lines are widely spaced, I can image all objects in between also unguided because the
model computes the variable tracking rates for the entire sky area as well as +- 10 degrees outside those
two
Dec lines.



Below see the tracking graphs for the imaging that I am doing tonight. They show how the
tracking rates vary over approx 45 minutes.



Rolando







-----Original Message-----
From: Steven Steven <steven447@hotmail.com>
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Fri, May 29, 2020 9:52 pm
Subject: Re: [ap-gto] Mach2 Unguided testing continues


That comment has to do with a long-time on-line chess game, not for you/forum. Sorry, it's
wrong context, I've a busy day on email today.

In reply to your own query, you set up each object separately. I use the Autosave feature
on Maxim and when finished with one object, move to it, train ATrack, and make the second one. It's that
simple.
The User Guide is a good start and will answer your queries without reference to chess moves. 😉

S






Re: Mach2 Unguided testing continues

Don Anderson
 

Hello Bill
That is not quite the complete story. Pinpoint mainly uses locally stored databases (you need to download them) such as GSC11, UCAC4, and USNO 2.0, as well as several others, for plate solving. Pinpoint can use Astrometry.net (which needs an internet connection) when the scope position is too far off to get a local solve. After an Astrometry.net solve, you can then refine your position with another local solve. ANSVR is another option for doing solves when you do not have an internet connection; however, the database is huge and would take many hours to download. Pinpoint is extremely versatile and well worth considering.
Cheers & Clear Skies
Don Anderson
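
For comparison with the PinPoint/ANSVR discussion above, this is roughly what a purely local astrometry.net solve looks like when driven from a script (ANSVR wraps the same solver engine). The file name, image-scale hint, and scale tolerances are placeholders; check the flags against your own astrometry.net installation.

import subprocess

def local_solve(fits_path, scale_arcsec_per_px):
    """Run a local astrometry.net solve, hinted with an approximate image scale."""
    subprocess.run(
        [
            "solve-field", fits_path,
            "--overwrite", "--no-plots",
            "--scale-units", "arcsecperpix",
            "--scale-low", str(scale_arcsec_per_px * 0.9),
            "--scale-high", str(scale_arcsec_per_px * 1.1),
        ],
        check=True,
    )

local_solve("sub_0001.fits", 4.5)   # placeholder file name and image scale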


On Saturday, May 30, 2020, 11:07:48 a.m. MDT, Bill Long <bill@...> wrote:


Reading the site, it seems PinPoint uses ANSVR for offline (non-internet based) plate solving and uses Astrometry.net for online plate solving. Not really sure that is worth $150 considering other applications can use these same tools and methods for free.


From: main@ap-gto.groups.io <main@ap-gto.groups.io> on behalf of Donald Rudny <mkea13800@...>
Sent: Saturday, May 30, 2020 9:59 AM
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Subject: Re: [ap-gto] Mach2 Unguided testing continues
 
Hi Rolando,

Here is the link to the Pinpoint site.

http://pinpoint.dc3.com/

There is a section on accuracy that suggests that it is very accurate.

I’m not trying to say that any system is better than any other.  I’m just trying to understand what is available and what options users may have.  Cost is important as well.  I believe Atrack is free, but Pinpoint runs about $150.  They have a 60 day trial period.  Some users out there with CP3’s might be interested in this, too.

I like your keypad method for what I do, which is usually a portable setup with my AP1100 and C11 Edge at  f/6 or f/2 with Hyperstar.  Sometimes I will do f/10.  It might not sound very portable, but I put everything in the back of my pickup and drive up to Maunakea to set up off the back of the truck.  It takes about 20 minutes.  It gets pretty cold, so we sit in the pickup and view the object images on an extended monitor.  I run the cables through the rear window.  When the VIS was open, we did something similar and put on shows for the visitors with the equipment there.  It was a big hit.  I use a Mac, so I was able to download our captured images to the visitors iPhones.  Some people stayed the whole night just to get all of them.  That’s when I first experienced your excellent mounts and keypad.  They had an AP1100 set up with an 11” RASA.  I didn’t purchase a keypad with my AP1100 because of the expense.  I decided to go with my iPad and SkySafari.  I have Luminos too.  Both work very well and are very inexpensive.  I did purchase APCC, but I find it lacking for the price.  No catalogue of objects.  No Mac version.  Not very intuitive.  About the only thing I use it for is to initialize the mount at my home observatory.  Then I switch to my iPad and SkySafari for control.

I really don’t like to set up the auto guiding stuff, so your keypad system interests me.  I think it would work best for what I need, but I would need to center the star by eye on my Mac computer screen with crosshairs.  How accurate will that be?  It will probably be more valuable for me running at f/6 or f/10 with the C11 than at f/2.  At f/2, I can already get a couple of minutes unguided just with using the RAPAS for PA.  It’s f/6 and especially f/10 that drift some.  

I do have the CP4, but I would need to invest another $900 for a keypad, so you see my dilemma.  I’m willing to invest, but I need to know that it will really work for me.

Thanks,

Don

Don Rudny


On May 29, 2020, at 7:38 PM, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:



How accurate is centering a star by eye.
I actually do it automatically in MaximDL. I switch the main camera over to guide mode, let the guider program pick whatever star it wants in the image and let it autoguide. At the end of 3 - 5 minutes I press the Enter button on the keypad and let the program advance to the next calibration point. I don't do the centering manually, I let MaximDL do that. When I have gathered enough points I switch the camera back over to imaging and take my exposures.

I can see where Atrack can adjust the guide rates after each image, however that means that there is drift in the image that is being measured. I'm not 100% convinced that plate solve can determine the position to an accuracy level of sub-arc seconds, so I did not pursue this method. Besides, if you're going to use fancy software for this, then I would simply use APCC Pro and make an all-sky model, and be done with it. My program is for portable setups where the user has a mount, scope and keypad, maybe a laptop and maybe just a digital camera. No fancy software, just the basics to have some imaging fun.

Rolando


-----Original Message-----
From: Donald Rudny <mkea13800@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:12 am
Subject: Re: [ap-gto] Mach2 Unguided testing continues

Rolando,

I haven’t used Atrack, but have been reviewing the manual and I believe the program continuously adjusts the drift rate as you capture images.  They are saved in a file that is watched by the program and plate solved through Pinpoint.  I believe there is also a modeling routine that I assume can be saved as long as the setup remains the same.  As I say, I’m not 100% sure of this, but it looks like it’s worth taking a look at.  

One question I have on the AP keypad system is accuracy.  How accurate is centering a star by eye.  I would think that a longer time between inputs would be necessary to improve accuracy.  If I have a fairly decent PA, I would think I might need a 10-15 minute drift measurement.  A longer focal length would also help.

Don

Don Rudny


On May 29, 2020, at 6:15 PM, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:


Basically you get one data point in about 200 seconds (3 - 4 minutes) which sets the drift rate at that point in the sky. So that will work for a while, maybe 1/2 hour to an hour, after which the drift will have changed. So then you take another 3-4 minute run to establish a new drift rate. You do that every hour or so along the path that the object takes.

That's exactly what I'm doing also. 3 - 5 minute drift measurement which is good for an hour. However, if I take just 3 drift measurements along the path, spread out over a 5 - 6 hour period, the CP5 will then compute a continuously variable tracking rate for all points in between for the entire 6 hour period. I can do this measurement all at once before the sun goes down using 3 widely spaced stars of Mag4 or brighter and an H-a filter.  Takes about 15 minutes total. This path is then computed and ready to go when twilight ends. I can even do another object path at a different Dec, download it and be ready to image two objects. In fact, if the two Dec lines are widely spaced, I can image all objects in between also unguided because the model computes the variable tracking rates for the entire sky area as well as +- 10 degrees outside those two Dec lines.
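
Purely to illustrate the idea (this is not the CP4/CP5 algorithm, and the numbers are invented), three drift measurements taken along the object's path can be turned into a continuously variable rate correction by fitting and interpolating between them, something like:

import numpy as np

# Hour angles (hours) where drift was measured, and the measured drift
# rates (arcsec/min) in RA and Dec -- placeholder values, not real data.
ha_samples = np.array([-2.5, 0.0, 2.5])
ra_drift   = np.array([ 0.8, 0.2, -0.5])
dec_drift  = np.array([-0.3, 0.1,  0.4])

# A quadratic through the three samples gives a smoothly varying correction
# over the whole span (and a little beyond, by extrapolation).
ra_fit  = np.polyfit(ha_samples, ra_drift, 2)
dec_fit = np.polyfit(ha_samples, dec_drift, 2)

def rate_correction(ha_hours):
    """Tracking-rate offsets (arcsec/min) to apply at a given hour angle."""
    return float(np.polyval(ra_fit, ha_hours)), float(np.polyval(dec_fit, ha_hours))

print(rate_correction(1.25))   # correction midway between two of the measurements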

Below see the tracking graphs for the imaging that I am doing tonight. They show how the tracking rates vary over approx 45 minutes.

Rolando




-----Original Message-----
From: Steven Steven <steven447@...>
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Fri, May 29, 2020 9:52 pm
Subject: Re: [ap-gto] Mach2 Unguided testing continues

That comment has to do with a long-time on-line chess game, not for you/forum. Sorry, it's wrong context, I've a busy day on email today.

In reply to your own query, you set up each object separately. I use the Autosave feature on Maxim and when finished with one object, move to it, train ATrack, and make the second one. It's that simple. The User Guide is a good start and will answer your queries without reference to chess moves. 😉

S



Re: [ap-ug] We're back in Space!

thefamily90 Phillips
 

Yahoo!!


From: main@ap-gto.groups.io <main@ap-gto.groups.io> on behalf of Jay Otts via groups.io <jayotts@...>
Sent: Saturday, May 30, 2020 5:17:12 PM
To: main@ap-gto.groups.io <main@ap-gto.groups.io>; main@ap-ug.groups.io <main@ap-ug.groups.io>
Subject: Re: [ap-gto] [ap-ug] We're back in Space!
 
Ditto!!!



Sent from my T-Mobile 4G LTE Device


-------- Original message --------
From: Pete Lardizabal <p14@...>
Date: 5/30/20 2:46 PM (GMT-06:00)
To: main@ap-ug.groups.io
Cc: main@ap-gto.groups.io
Subject: Re: [ap-gto] [ap-ug] We're back in Space!

👍🏻🇺🇸

WOW!

😎

Pete

On May 30, 2020, at 3:29 PM, Roland Christen via groups.io <chris1011@...> wrote:


Just watched the launch - SUPER COOL! 🙂😍😝

Rolando


Re: [ap-ug] We're back in Space!

Jay Otts
 

Ditto!!!



Sent from my T-Mobile 4G LTE Device


-------- Original message --------
From: Pete Lardizabal <p14@...>
Date: 5/30/20 2:46 PM (GMT-06:00)
To: main@ap-ug.groups.io
Cc: main@ap-gto.groups.io
Subject: Re: [ap-gto] [ap-ug] We're back in Space!

👍🏻🇺🇸

WOW!

😎

Pete

On May 30, 2020, at 3:29 PM, Roland Christen via groups.io <chris1011@...> wrote:


Just watched the launch - SUPER COOL! 🙂😍😝

Rolando


Re: Mach2 Unguided testing continues

Donald Rudny
 

Hi Ray,

I think any imaging program would work. From my understanding the images just need to be saved in a folder that is watched by Atrack. It coordinates Pinpoint to plate solve the images and then adjusts the tracking rate accordingly. I haven’t used it yet, but have been reviewing the user guide. I assume that any image file format could be used similar to astrometry.net. I use Starlight Live, and it can save each exposure to a selected folder as a fits file. I think that will work. Unfortunately it doesn’t solve my Mac issue. Starlight Live does have a native Windows version, too, but I prefer Mac.
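
To make the watch-folder idea concrete, here is a very rough sketch of that loop. It is not Atrack or PinPoint code: plate_solve() and apply_rate_correction() are hypothetical stand-ins for whatever solver and mount interface you actually use, and the drift math ignores the cos(Dec) factor for brevity.

import time
from pathlib import Path

WATCH_DIR = Path("C:/AstroImages/incoming")   # folder the capture program saves into

def plate_solve(fits_path):
    """Hypothetical: return (ra_deg, dec_deg) of the frame center."""
    raise NotImplementedError

def apply_rate_correction(d_ra_arcsec_per_s, d_dec_arcsec_per_s):
    """Hypothetical: send small tracking-rate offsets to the mount."""
    raise NotImplementedError

seen = set()
last = None   # (timestamp, ra_deg, dec_deg) of the previously solved frame

while True:
    for f in sorted(WATCH_DIR.glob("*.fit*")):
        if f in seen:
            continue
        seen.add(f)
        ra, dec = plate_solve(f)
        now = time.time()
        if last is not None:
            dt = now - last[0]
            d_ra  = (ra  - last[1]) * 3600.0 / dt   # drift since last frame, arcsec/s
            d_dec = (dec - last[2]) * 3600.0 / dt
            apply_rate_correction(-d_ra, -d_dec)    # nudge the rates to cancel the drift
        last = (now, ra, dec)
    time.sleep(5)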

I have the APCC standard version. I do have a Windows 10 machine strictly used for Astro stuff. I use it to initialize my mount and park it for my home observatory. After initializing, I connect to my iPad and use SkySafari the rest of the night. My home observatory uses a P3 park position, so that’s why I use APCC. I understand SkySafari was supposed to add other park positions, but I don’t think that happened yet. They only use P4 which works well in the field. There I don’t even bring my PC with Windows.

I’ve been a Mac guy since 1984. I cringe every time I even hear the word Windows. I see all the issues everyone has with it. The idea of having APCC on a Mac or iPad sounds appealing. I just hope I don’t have to pay for it again. I was a little surprised to have to shell out what I did for APCC after spending over eight large on the mount. One would think that software to run it would have been included. I love the mount, but sorry to say, was disappointed in the control software.

Don

Don Rudny

On May 30, 2020, at 7:57 AM, Ray Gralak <groups3@gralak.com> wrote:

Hi Donald,

I’m not trying to say that any system is better than any other. I’m just trying to understand what is
available and what options users may have. Cost is important as well. I believe Atrack is free, but Pinpoint
runs about $150. They have a 60 day trial period. Some users out there with CP3’s might be interested in this,
too.
Wouldn't you also need an imaging program, like MaximDL Pro or SkyXPro?

I did purchase APCC, but I find it lacking for the price. No
catalogue of objects. No Mac version. Not very intuitive. About the only thing I use it for is to initialize the
mount at my home observatory. Then I switch to my iPad and SkySafari for control.
I take it you have APCC Standard? If you are not using Windows, you are then missing out on some safety and convenience features, like auto-park, and meridian and horizon limits.

So, what if there was an iPad version of APCC with a full planetarium view? One of the plans for the next full version of APCC is to make it cross-platform. I've already started designing it and iOS is one of the possible target platforms.

-Ray Gralak
Author of APCC (Astro-Physics Command Center): https://www.astro-physics.com/apcc-pro
Author of PEMPro V3: https://www.ccdware.com
Author of Astro-Physics V2 ASCOM Driver: https://www.siriusimaging.com/apdriver


-----Original Message-----
From: main@ap-gto.groups.io [mailto:main@ap-gto.groups.io] On Behalf Of Donald Rudny
Sent: Saturday, May 30, 2020 10:04 AM
To: main@ap-gto.groups.io
Subject: Re: [ap-gto] Mach2 Unguided testing continues

Rolando,

If you’re interested, here’s my mobile setup.




Don Rudny



On May 30, 2020, at 6:59 AM, Donald Rudny via groups.io <mkea13800=gmail.com@groups.io> wrote:




Hi Rolando,

Here is the link to the Pinpoint site.

http://pinpoint.dc3.com/

There is a section on accuracy that suggests that it is very accurate.

I’m not trying to say that any system is better than any other. I’m just trying to understand what is
available and what options users may have. Cost is important as well. I believe Atrack is free, but Pinpoint
runs about $150. They have a 60 day trial period. Some users out there with CP3’s might be interested in this,
too.

I like your keypad method for what I do, which is usually a portable setup with my AP1100 and C11 Edge
at f/6 or f/2 with Hyperstar. Sometimes I will do f/10. It might not sound very portable, but I put everything in
the back of my pickup and drive up to Maunakea to set up off the back of the truck. It takes about 20 minutes.
It gets pretty cold, so we sit in the pickup and view the object images on an extended monitor. I run the cables
through the rear window. When the VIS was open, we did something similar and put on shows for the visitors
with the equipment there. It was a big hit. I use a Mac, so I was able to download our captured images to the
visitors iPhones. Some people stayed the whole night just to get all of them. That’s when I first experienced
your excellent mounts and keypad. They had an AP1100 set up with an 11” RASA. I didn’t purchase a keypad
with my AP1100 because of the expense. I decided to go with my iPad and SkySafari. I have Luminos too.
Both work very well and are very inexpensive. I did purchase APCC, but I find it lacking for the price. No
catalogue of objects. No Mac version. Not very intuitive. About the only thing I use it for is to initialize the
mount at my home observatory. Then I switch to my iPad and SkySafari for control.

I really don’t like to set up the auto guiding stuff, so your keypad system interests me. I think it would
work best for what I need, but I would need to center the star by eye on my Mac computer screen with
crosshairs. How accurate will that be? It will probably be more valuable for me running at f/6 or f/10 with the
C11 than at f/2. At f/2, I can already get a couple of minutes unguided just with using the RAPAS for PA. It’s
f/6 and especially f/10 that drift some.

I do have the CP4, but I would need to invest another $900 for a keypad, so you see my dilemma. I’m
willing to invest, but I need to know that it will really work for me.

Thanks,

Don


Don Rudny



On May 29, 2020, at 7:38 PM, uncarollo2 <chris1011@aol.com> via groups.io
<chris1011=aol.com@groups.io> wrote:






How accurate is centering a star by eye.

I actually do it automatically in MaximDL. I switch the main camera over to guide mode, let the
guider program pick whatever star it wants in the image and let it autoguide. At the end of 3 - 5 minutes I press
the Enter button on the keypad and let the program advance to the next calibration point. I don't do the
centering manually, I let MaximDL do that. When I have gathered enough points I switch the camera back over
to imaging and take my exposures.


I can see where Atrack can adjust the guide rates after each image, however that means that
there is drift in the image that is being measured. I'm not 100% convinced that plate solve can determine the
position to an accuracy level of sub-arc seconds, so I did not pursue this method. Besides, if you're going to
use fancy software for this, then I would simply use APCC Pro and make an all-sky model, and be done with it.
My program is for portable setups where the user has a mount, scope and keypad, maybe a laptop and maybe
just a digital camera. No fancy software, just the basics to have some imaging fun.


Rolando



-----Original Message-----
From: Donald Rudny <mkea13800@gmail.com>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:12 am
Subject: Re: [ap-gto] Mach2 Unguided testing continues


Rolando,

I haven’t used Atrack, but have been reviewing the manual and I believe the program
continuously adjusts the drift rate as you capture images. They are saved in a file that is watched by the
program and plate solved through Pinpoint. I believe there is also a modeling routine that I assume can be
saved as long as the setup remains the same. As I say, I’m not 100% sure of this, but it looks like it’s worth
taking a look at.

One question I have on the AP keypad system is accuracy. How accurate is centering a star by
eye. I would think that a longer time between inputs would be necessary to improve accuracy. If I have a fairly
decent PA, I would think I might need a 10-15 minute drift measurement. A longer focal length would also help.

Don


Don Rudny



On May 29, 2020, at 6:15 PM, uncarollo2 <chris1011@aol.com> via groups.io
<chris1011=aol.com@groups.io> wrote:




Basically you get one data point in about 200 seconds (3 - 4 minutes) which sets the drift
rate at that point in the sky. So that will work for a while, maybe 1/2 hour to an hour, after which the drift will
have changed. So then you take another 3-4 minute run to establish a new drift rate. You do that every hour or
so along the path that the object takes.


That's exactly what I'm doing also. 3 - 5 minute drift measurement which is good for an
hour. However, if I take just 3 drift measurements along the path, spread out over a 5 - 6 hour period, the CP5
will then compute a continuously variable tracking rate for all points in between for the entire 6 hour period. I
can do this measurement all at once before the sun goes down using 3 widely spaced stars of Mag4 or brighter
and an H-a filter. Takes about 15 minutes total. This path is then computed and ready to go when twilight
ends. I can even do another object path at a different Dec, download it and be ready to image two objects. In
fact, if the two Dec lines are widely spaced, I can image all objects in between also unguided because the
model computes the variable tracking rates for the entire sky area as well as +- 10 degrees outside those two
Dec lines.



Below see the tracking graphs for the imaging that I am doing tonight. They show how the
tracking rates vary over approx 45 minutes.



Rolando







-----Original Message-----
From: Steven Steven <steven447@hotmail.com>
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Fri, May 29, 2020 9:52 pm
Subject: Re: [ap-gto] Mach2 Unguided testing continues


That comment has to do with a long-time on-line chess game, not for you/forum. Sorry, it's
wrong context, I've a busy day on email today.

In reply to your own query, you set up each object separately. I use the Autosave feature
on Maxim and when finished with one object, move to it, train ATrack, and make the second one. It's that simple.
The User Guide is a good start and will answer your queries without reference to chess moves. 😉

S






Re: [ap-ug] We're back in Space!

Pete Lardizabal
 

👍🏻🇺🇸

WOW!

😎

Pete

On May 30, 2020, at 3:29 PM, Roland Christen via groups.io <chris1011@...> wrote:


Just watched the launch - SUPER COOL! 🙂😍😝

Rolando


Re: What's the effect of imaging through jet stream?

Leon
 

Stellarvue 80mm with a StarShoot guide camera, sitting on a TMB 6” refractor with an ASI071 imaging camera


On May 30, 2020, at 2:32 PM, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:


Are you using an off-axis guider or separate guide scope? What is the main scope you are imaging with?

Rolando





-----Original Message-----
From: Cheng-Yang Tan via groups.io <cytan299@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:53 pm
Subject: [ap-gto] What's the effect of imaging through jet stream?

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan


Re: [ap-ug] We're back in Space!

Greg Hartke
 

Lordy, mama! I was in tears!!!

 

Greg

 

From: main@ap-ug.groups.io <main@ap-ug.groups.io> On Behalf Of Roland Christen via groups.io
Sent: Saturday, May 30, 2020 3:30 PM
To: main@ap-gto.groups.io; main@ap-ug.groups.io
Subject: [ap-ug] We're back in Space!

 

Just watched the launch - SUPER COOL! 🙂😍😝

 

Rolando


Re: What's the effect of imaging through jet stream?

Cheng-Yang Tan
 

Hi Rolando,
   I used a guide scope that is 4.59 arcsec/pixel (focal length 168 mm). Scope is a FSQ106. I plan to image again tonight. Again, the jet stream seems to be quite bad: 45 m/s. So if there's anything else to check, I'm open to suggestions.

cytan

P.S. I will be upgrading to OAG next week after my SBIG Starchaser SC-2 arrives. This is in preparation for an Adaptive Optics AO-8A in the future.


On Saturday, May 30, 2020, 02:32:25 PM CDT, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:


Are you using an off-axis guider or separate guide scope? What is the main scope you are imaging with?

Rolando





-----Original Message-----
From: Cheng-Yang Tan via groups.io <cytan299@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:53 pm
Subject: [ap-gto] What's the effect of imaging through jet stream?

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan


Re: What's the effect of imaging through jet stream?

Roland Christen
 

Are you using an off-axis guider or separate guide scope? What is the main scope you are imaging with?

Rolando





-----Original Message-----
From: Cheng-Yang Tan via groups.io <cytan299@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:53 pm
Subject: [ap-gto] What's the effect of imaging through jet stream?

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan


We're back in Space!

Roland Christen
 

Just watched the launch - SUPER COOL! 🙂😍😝

Rolando


Re: What's the effect of imaging through jet stream?

Leon
 

Sorry to jump in here, but I was wondering if anyone could tell me how aggressive the PHD2 settings should be when using an AP1100? I noticed my settings are around 65 and 55. Is there a recommendation for the 1100?  Thanks


On May 30, 2020, at 1:08 PM, Mike Shade <mshade@q.com> wrote:



Consistent problems suggest hardware problems, polar alignment, PEC curve issues, flexure, poor guiding parameters, that sort of thing, in that they impact everything consistently.  Unusual or rare problems like you are sharing suggest transient issues, the most likely being seeing.  I work with a 17" telescope at .63"/pixel, so I know how seeing can ruin things.  If the system returns consistent results and then all of a sudden doesn't, the first thought is seeing.  The fact that your stars were not messed up in one consistent direction is a clue.  I did have a time where my images were horrible all of a sudden and then got good again.  I finally went out and found a rather large owl sitting on the edge of the telescope's upper ring.  Luckily there was no deposit on the primary mirror.

 

There are numerous references on the impact of seeing and on how professionals monitor and search for sites with good seeing.

 

Mike J. Shade

Mike J. Shade Photography:

mshadephotography.com

 

In War: Resolution

In Defeat: Defiance

In Victory: Magnanimity

In Peace: Goodwill

Sir Winston Churchill

Already, in the gathering dusk, a few of the stars are turning on their lights.

Vega, the brightest one, is now dropping towards the west.  Can it be half

a year since I watched her April rising in the east?  Low in the southwest

Antares blinks a sad farewell to fall...

Leslie Peltier, Starlight Nights

 

International Dark Sky Association: www.darksky.org

 

From: main@ap-gto.groups.io [mailto:main@ap-gto.groups.io] On Behalf Of Cheng-Yang Tan via groups.io
Sent: Saturday, May 30, 2020 10:53 AM
To: main@ap-gto.groups.io
Subject: [ap-gto] What's the effect of imaging through jet stream?

 

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan


Re: What's the effect of imaging through jet stream?

Mike Shade
 

Consistent problems suggest hardware problems, polar alignment, PEC curve issues, flexure, poor guiding parameters, that sort of thing, in that they impact everything consistently.  Unusual or rare problems like you are sharing suggest transient issues, the most likely being seeing.  I work with a 17" telescope at .63"/pixel, so I know how seeing can ruin things.  If the system returns consistent results and then all of a sudden doesn't, the first thought is seeing.  The fact that your stars were not messed up in one consistent direction is a clue.  I did have a time where my images were horrible all of a sudden and then got good again.  I finally went out and found a rather large owl sitting on the edge of the telescope's upper ring.  Luckily there was no deposit on the primary mirror.

 

There are numerous references on the impact of seeing and on how professionals monitor and search for sites with good seeing.

 

Mike J. Shade

Mike J. Shade Photography:

mshadephotography.com

 

In War: Resolution

In Defeat: Defiance

In Victory: Magnanimity

In Peace: Goodwill

Sir Winston Churchill

Already, in the gathering dusk, a few of the stars are turning on their lights.

Vega, the brightest one, is now dropping towards the west.  Can it be half

a year since I watched her April rising in the east?  Low in the southwest

Antares blinks a sad farewell to fall...

Leslie Peltier, Starlight Nights

 

International Dark Sky Association: www.darksky.org

 

From: main@ap-gto.groups.io [mailto:main@ap-gto.groups.io] On Behalf Of Cheng-Yang Tan via groups.io
Sent: Saturday, May 30, 2020 10:53 AM
To: main@ap-gto.groups.io
Subject: [ap-gto] What's the effect of imaging through jet stream?

 

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan


Re: Mach2 Unguided testing continues

Ray Gralak
 

Hi Donald,

I’m not trying to say that any system is better than any other. I’m just trying to understand what is
available and what options users may have. Cost is important as well. I believe Atrack is free, but Pinpoint
runs about $150. They have a 60 day trial period. Some users out there with CP3’s might be interested in this,
too.
Wouldn't you also need an imaging program, like MaximDL Pro or SkyXPro?

I did purchase APCC, but I find it lacking for the price. No
catalogue of objects. No Mac version. Not very intuitive. About the only thing I use it for is to initialize the
mount at my home observatory. Then I switch to my iPad and SkySafari for control.
I take it you have APCC Standard? If you are not using Windows, you are then missing out on some safety and convenience features, like auto-park, and meridian and horizon limits.

So, what if there was an iPad version of APCC with a full planetarium view? One of the plans for the next full version of APCC is to make it cross-platform. I've already started designing it and iOS is one of the possible target platforms.

-Ray Gralak
Author of APCC (Astro-Physics Command Center): https://www.astro-physics.com/apcc-pro
Author of PEMPro V3: https://www.ccdware.com
Author of Astro-Physics V2 ASCOM Driver: https://www.siriusimaging.com/apdriver


-----Original Message-----
From: main@ap-gto.groups.io [mailto:main@ap-gto.groups.io] On Behalf Of Donald Rudny
Sent: Saturday, May 30, 2020 10:04 AM
To: main@ap-gto.groups.io
Subject: Re: [ap-gto] Mach2 Unguided testing continues

Rolando,

If you’re interested, here’s my mobile setup.




Don Rudny



On May 30, 2020, at 6:59 AM, Donald Rudny via groups.io <mkea13800=gmail.com@groups.io> wrote:




Hi Rolando,

Here is the link to the Pinpoint site.

http://pinpoint.dc3.com/

There is a section on accuracy that suggests that it is very accurate.

I’m not trying to say that any system is better than any other. I’m just trying to understand what is
available and what options users may have. Cost is important as well. I believe Atrack is free, but Pinpoint
runs about $150. They have a 60 day trial period. Some users out there with CP3’s might be interested in this,
too.

I like your keypad method for what I do, which is usually a portable setup with my AP1100 and C11 Edge
at f/6 or f/2 with Hyperstar. Sometimes I will do f/10. It might not sound very portable, but I put everything in
the back of my pickup and drive up to Maunakea to set up off the back of the truck. It takes about 20 minutes.
It gets pretty cold, so we sit in the pickup and view the object images on an extended monitor. I run the cables
through the rear window. When the VIS was open, we did something similar and put on shows for the visitors
with the equipment there. It was a big hit. I use a Mac, so I was able to download our captured images to the
visitors iPhones. Some people stayed the whole night just to get all of them. That’s when I first experienced
your excellent mounts and keypad. They had an AP1100 set up with an 11” RASA. I didn’t purchase a keypad
with my AP1100 because of the expense. I decided to go with my iPad and SkySafari. I have Luminos too.
Both work very well and are very inexpensive. I did purchase APCC, but I find it lacking for the price. No
catalogue of objects. No Mac version. Not very intuitive. About the only thing I use it for is to initialize the
mount at my home observatory. Then I switch to my iPad and SkySafari for control.

I really don’t like to set up the auto guiding stuff, so your keypad system interests me. I think it would
work best for what I need, but I would need to center the star by eye on my Mac computer screen with
crosshairs. How accurate will that be? It will probably be more valuable for me running at f/6 or f/10 with the
C11 than at f/2. At f/2, I can already get a couple of minutes unguided just with using the RAPAS for PA. It’s
f/6 and especially f/10 that drift some.

I do have the CP4, but I would need to invest another $900 for a keypad, so you see my dilemma. I’m
willing to invest, but I need to know that it will really work for me.

Thanks,

Don


Don Rudny



On May 29, 2020, at 7:38 PM, uncarollo2 <chris1011@aol.com> via groups.io
<chris1011=aol.com@groups.io> wrote:






How accurate is centering a star by eye.

I actually do it automatically in MaximDL. I switch the main camera over to guide mode, let the
guider program pick whatever star it wants in the image and let it autoguide. At the end of 3 - 5 minutes I press
the Enter button on the keypad and let the program advance to the next calibration point. I don't do the
centering manually, I let MaximDL do that. When I have gathered enough points I switch the camera back over
to imaging and take my exposures.


I can see where Atrack can adjust the guide rates after each image, however that means that
there is drift in the image that is being measured. I'm not 100% convinced that plate solve can determine the
position to an accuracy level of sub-arc seconds, so I did not pursue this method. Besides, if you're going to
use fancy software for this, then I would simply use APCC Pro and make an all-sky model, and be done with it.
My program is for portable setups where the user has a mount, scope and keypad, maybe a laptop and maybe
just a digital camera. No fancy software, just the basics to have some imaging fun.


Rolando



-----Original Message-----
From: Donald Rudny <mkea13800@gmail.com>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:12 am
Subject: Re: [ap-gto] Mach2 Unguided testing continues


Rolando,

I haven’t used Atrack, but have been reviewing the manual and I believe the program
continuously adjusts the drift rate as you capture images. They are saved in a file that is watched by the
program and plate solved through Pinpoint. I believe there is also a modeling routine that I assume can be
saved as long as the setup remains the same. As I say, I’m not 100% sure of this, but it looks like it’s worth
taking a look at.

One question I have on the AP keypad system is accuracy. How accurate is centering a star by
eye. I would think that a longer time between inputs would be necessary to improve accuracy. If I have a fairly
decent PA, I would think I might need a 10-15 minute drift measurement. A longer focal length would also help.

Don


Don Rudny



On May 29, 2020, at 6:15 PM, uncarollo2 <chris1011@aol.com> via groups.io
<chris1011=aol.com@groups.io> wrote:




Basically you get one data point in about 200 seconds (3 - 4 minutes) which sets the drift
rate at that point in the sky. So that will work for a while, maybe 1/2 hour to an hour, after which the drift will
have changed. So then you take another 3-4 minute run to establish a new drift rate. You do that every hour or
so along the path that the object takes.


That's exactly what I'm doing also. 3 - 5 minute drift measurement which is good for an
hour. However, if I take just 3 drift measurements along the path, spread out over a 5 - 6 hour period, the CP5
will then compute a continuously variable tracking rate for all points in between for the entire 6 hour period. I
can do this measurement all at once before the sun goes down using 3 widely spaced stars of Mag4 or brighter
and an H-a filter. Takes about 15 minutes total. This path is then computed and ready to go when twilight
ends. I can even do another object path at a different Dec, download it and be ready to image two objects. In
fact, if the two Dec lines are widely spaced, I can image all objects in between also unguided because the
model computes the variable tracking rates for the entire sky area as well as +- 10 degrees outside those two
Dec lines.



Below see the tracking graphs for the imaging that I am doing tonight. They show how the
tracking rates vary over approx 45 minutes.



Rolando







-----Original Message-----
From: Steven Steven <steven447@hotmail.com>
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Fri, May 29, 2020 9:52 pm
Subject: Re: [ap-gto] Mach2 Unguided testing continues


That comment has to do with a long-time on-line chess game, not for you/forum. Sorry, it's
wrong context, I've a busy day on email today.

In reply to your own query, you set up each object separately. I use the Autosave feature
on Maxim and when finished with one object, move to it, train ATrack, and make the second one. It's that simple.
The User Guide is a good start and will answer your queries without reference to chess moves. 😉

S




What's the effect of imaging through jet stream?

Cheng-Yang Tan
 

Hi guys,
   Hopefully my question is not off topic. But here goes:

   I imaged M101 last night (alt at 76 deg and close to the meridian) with my Mach1GTO and I had to throw away 1/2 my subframes. I examined the bad subframes and there's no consistent direction for the eggy stars from each subframe. Some subframes' eggy stars were in the RA direction, some were in the DEC direction, and some were angled w.r.t. RA and DEC. I looked at the meteoblue seeing map (attached) and it says that the jet stream was at around 31 m/s last night. The PHD2 guide graph showed about 0.5 arcsec RMS error for the entire night, which wasn't too bad because my image scale is 2.1 arcsec/pixel.

Is the above something that I'd expect imaging through the jet stream? Or I should be looking for something else to blame like flexure?

Thanks (before I start tearing everything apart :) )

cytan


Re: Mach2 Unguided testing continues

Donald Rudny
 

I use Starlight Live.  I have PHD on my Mac also, so I could switch over to that for the calibration.  That would require going through the PHD calibration on a selected star for each object and maybe each calibration point.  It would be simpler to just use the keypad and the crosshairs by eye in the imaging program.  If the drift is significant in, say, one or two minutes, I think it would be pretty accurate, especially at the longer focal lengths where I need it.  Like you wrote earlier, keep it simple and have more time for imaging fun.
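
A quick back-of-envelope on how visible that drift would be on the chip (the 5.4 micron pixel size is a placeholder; substitute your own sensor and focal length):

# How far a given drift moves the star on the chip in a short check.
def drift_in_pixels(drift_arcsec_per_hr, minutes, focal_len_mm, pixel_um=5.4):
    image_scale = 206.265 * pixel_um / focal_len_mm     # arcsec per pixel
    return (drift_arcsec_per_hr * minutes / 60.0) / image_scale

# A 30 arcsec/hour drift watched for 2 minutes:
print(drift_in_pixels(30, 2, 560))    # C11 at f/2  -> ~0.5 px, hard to judge
print(drift_in_pixels(30, 2, 2800))   # C11 at f/10 -> ~2.5 px, plainly visible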

Don

Don Rudny


On May 30, 2020, at 7:13 AM, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:


What program do you use to image? I use MaximDL which has both imaging and guide capability from the same chip. So for taking centering data I use the guide program, then for imaging I simply switch over to the imaging portion. I believe that PHD2 also has both capabilities. No separate guide camera needed, one camera does both data gathering during twilight and imaging after dusk.

Does that make sense or am I not explaining it well?

Rolando



-----Original Message-----
From: Donald Rudny <mkea13800@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 11:59 am
Subject: Re: [ap-gto] Mach2 Unguided testing continues

Hi Rolando,

Here is the link to the Pinpoint site.


There is a section on accuracy that suggests that it is very accurate.

I’m not trying to say that any system is better than any other.  I’m just trying to understand what is available and what options users may have.  Cost is important as well.  I believe Atrack is free, but Pinpoint runs about $150.  They have a 60 day trial period.  Some users out there with CP3’s might be interested in this, too.

I like your keypad method for what I do, which is usually a portable setup with my AP1100 and C11 Edge at f/6 or f/2 with Hyperstar.  Sometimes I will do f/10.  It might not sound very portable, but I put everything in the back of my pickup and drive up to Maunakea to set up off the back of the truck.  It takes about 20 minutes.  It gets pretty cold, so we sit in the pickup and view the object images on an extended monitor.  I run the cables through the rear window.

When the VIS was open, we did something similar and put on shows for the visitors with the equipment there.  It was a big hit.  I use a Mac, so I was able to download our captured images to the visitors' iPhones.  Some people stayed the whole night just to get all of them.  That's when I first experienced your excellent mounts and keypad.  They had an AP1100 set up with an 11" RASA.

I didn't purchase a keypad with my AP1100 because of the expense.  I decided to go with my iPad and SkySafari.  I have Luminos too.  Both work very well and are very inexpensive.  I did purchase APCC, but I find it lacking for the price.  No catalogue of objects.  No Mac version.  Not very intuitive.  About the only thing I use it for is to initialize the mount at my home observatory.  Then I switch to my iPad and SkySafari for control.

I really don't like to set up the autoguiding stuff, so your keypad system interests me.  I think it would work best for what I need, but I would need to center the star by eye on my Mac computer screen with crosshairs.  How accurate will that be?  It will probably be more valuable for me running at f/6 or f/10 with the C11 than at f/2.  At f/2, I can already get a couple of minutes unguided just by using the RAPAS for PA.  It's f/6 and especially f/10 that drift some.
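
For a rough feel of the centering question, here is a sketch assuming you can put the star within about 3 pixels of the crosshair and a hypothetical 5.4 micron pixel camera (both numbers are placeholders):

# Centering by eye: a fixed error in pixels corresponds to fewer arcsec
# as the focal length grows, so the longer configurations actually make
# the eyeball method more accurate, not less.
def centering_error_arcsec(error_pixels, focal_len_mm, pixel_um=5.4):
    return error_pixels * 206.265 * pixel_um / focal_len_mm

for fl in (560, 1680, 2800):          # C11 at f/2, f/6, f/10
    print(fl, round(centering_error_arcsec(3, fl), 1), "arcsec")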

I do have the CP4, but I would need to invest another $900 for a keypad, so you see my dilemma.  I’m willing to invest, but I need to know that it will really work for me.

Thanks,

Don

Don Rudny


On May 29, 2020, at 7:38 PM, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:



How accurate is centering a star by eye?
I actually do it automatically in MaximDL. I switch the main camera over to guide mode, let the guider program pick whatever star it wants in the image and let it autoguide. At the end of 3 - 5 minutes I press the Enter button on the keypad and let the program advance to the next calibration point. I don't do the centering manually, I let MaximDL do that. When I have gathered enough points I switch the camera back over to imaging and take my exposures.

I can see where Atrack can adjust the guide rates after each image; however, that means there is drift in the image that is being measured. I'm not 100% convinced that a plate solve can determine the position to sub-arcsecond accuracy, so I did not pursue this method. Besides, if you're going to use fancy software for this, then I would simply use APCC Pro, make an all-sky model, and be done with it. My program is for portable setups where the user has a mount, scope and keypad, maybe a laptop and maybe just a digital camera. No fancy software, just the basics to have some imaging fun.
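
As an illustration of the general idea being discussed (not ATrack's actual method): a drift rate can be fitted by least squares to a series of plate-solved image centers, which is why the single-solve accuracy matters less than it might seem. The numbers below are simulated.

import numpy as np

# Simulate plate-solved RA offsets taken every 5 minutes for an hour,
# each with 1 arcsec of solve noise, on top of a real 0.004 arcsec/s drift.
rng = np.random.default_rng(0)
t = np.arange(0, 3600, 300)
true_rate = 0.004
ra_offset = true_rate * t + rng.normal(0.0, 1.0, t.size)

# The slope of a straight-line fit recovers the drift rate to well below
# the per-frame noise, because many frames contribute to the fit.
fitted_rate, _ = np.polyfit(t, ra_offset, 1)
print(f"true {true_rate:.4f}  fitted {fitted_rate:.4f} arcsec/s")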

Rolando


-----Original Message-----
From: Donald Rudny <mkea13800@...>
To: main@ap-gto.groups.io
Sent: Sat, May 30, 2020 12:12 am
Subject: Re: [ap-gto] Mach2 Unguided testing continues

Rolando,

I haven’t used Atrack, but have been reviewing the manual and I believe the program continuously adjusts the drift rate as you capture images.  They are saved in a file that is watched by the program and plate solved through Pinpoint.  I believe there is also a modeling routine that I assume can be saved as long as the setup remains the same.  As I say, I’m not 100% sure of this, but it looks like it’s worth taking a look at.  

One question I have on the AP keypad system is accuracy.  How accurate is centering a star by eye?  I would think that a longer time between inputs would be necessary to improve accuracy.  If I have a fairly decent PA, I would think I might need a 10-15 minute drift measurement.  A longer focal length would also help.

Don

Don Rudny


On May 29, 2020, at 6:15 PM, uncarollo2 <chris1011@...> via groups.io <chris1011@...> wrote:


Basically you get one data point in about 200 seconds (3 - 4 minutes) which sets the drift rate at that point in the sky. So that will work for a while, maybe 1/2 hour to an hour, after which the drift will have changed. So then you take another 3-4 minute run to establish a new drift rate. You do that every hour or so along the path that the object takes.

That's exactly what I'm doing also. 3 - 5 minute drift measurement which is good for an hour. However, if I take just 3 drift measurements along the path, spread out over a 5 - 6 hour period, the CP5 will then compute a continuously variable tracking rate for all points in between for the entire 6 hour period. I can do this measurement all at once before the sun goes down using 3 widely spaced stars of Mag4 or brighter and an H-a filter.  Takes about 15 minutes total. This path is then computed and ready to go when twilight ends. I can even do another object path at a different Dec, download it and be ready to image two objects. In fact, if the two Dec lines are widely spaced, I can image all objects in between also unguided because the model computes the variable tracking rates for the entire sky area as well as +- 10 degrees outside those two Dec lines.

Below see the tracking graphs for the imaging that I am doing tonight. They show how the tracking rates vary over approx 45 minutes.

Rolando

[attached image: tracking rate graphs]



-----Original Message-----
From: Steven Steven <steven447@...>
To: main@ap-gto.groups.io <main@ap-gto.groups.io>
Sent: Fri, May 29, 2020 9:52 pm
Subject: Re: [ap-gto] Mach2 Unguided testing continues

That comment has to do with a long-running online chess game, not with you or the forum. Sorry, wrong context; I've had a busy day on email today.

In reply to your query, you set up each object separately. I use the Autosave feature in Maxim and, when finished with one object, move to the next, train ATrack, and make the second one. It's that simple. The User Guide is a good start and will answer your queries without reference to chess moves. 😉

S

