Comments
@CGastro
@CGastro 4 days ago
Let's take this one step further. Say I WBPPed a few nights and kept the calibrated files and all that. Then I add a couple more nights some time later. Is there a way to save the time of recalibrating the initial set of data and calibrating just the added data? Obviously measurements, LN (if applicable), registration, etc. must be done for the whole set.
@AdamBlock
@AdamBlock 3 days ago
Yes, calibrate the new data alone with WBPP. Then run post-processing on the entire set of data with WBPP. If you want to get fancy: if you know your registration reference, you can include it in the first step. Then you do the rest of the post-processing with WBPP on the complete set, so LN and Integration will be done on the complete set.
@rocketcityastro
@rocketcityastro 5 days ago
I can't wait to play with this one. Great job, Adam!!
@addos999
@addos999 9 days ago
You sure do come up with innovative ways to exploit PixInsight's tools, Adam. Bravo!
@Microtonal_Cats
@Microtonal_Cats 10 days ago
Is the Drizzle checkbox removed in this? I can't tell if we should just check "Rejection maps / drizzle files" in Calibration. It's not clear from the documentation whether that runs Drizzle or just prepares for it.
@AdamBlock
@AdamBlock 9 days ago
No drizzle ...
@richardsauerbrun2412
@richardsauerbrun2412 11 days ago
I am also having trouble adding the script to my repository. I added the link, saved, and then ran the update. The update showed new items, which I applied, and then restarted PI. But the script does not appear under Utilities. I have screenshots if they will help. I consider myself competent in this area, as I have added numerous items to my PI repository.
@AdamBlock
@AdamBlock 10 days ago
Sure... send a screenshot of your repository. Use the forum on my site (I assume you are a member).
@davecurtis8833
@davecurtis8833 11 days ago
Wonderful location and wonderful image
@AdamBlock
@AdamBlock 10 days ago
Thank you!
@ssbhide123
@ssbhide123 13 days ago
I don't have PixInsight yet but I like these videos 😅 BTW, how is this different from using the Defringe option in the Camera Raw filter of Adobe PS? It does a surprisingly good job with just one click!
@AdamBlock
@AdamBlock 13 days ago
BXT is a restoration algorithm. This is how it is different: it is attempting to reconstruct the PSF (the star intensity distribution). PS defringing is an edge modification... not a restoration.
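To make the distinction in this reply concrete: a restoration algorithm tries to invert the blur described by the PSF rather than just altering pixels near edges. Below is a toy 1-D Richardson-Lucy deconvolution, a classic restoration method used here purely as an illustration (BXT itself is a far more sophisticated neural-network tool, and this is not its implementation):

```python
import numpy as np

def richardson_lucy_1d(observed, psf, iterations=30):
    """Toy 1-D Richardson-Lucy deconvolution: iteratively refine an
    estimate of the true signal, given the observed (blurred) signal
    and the point-spread function that blurred it."""
    estimate = np.full_like(observed, observed.mean())
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# A "star": a single bright pixel, blurred by a small symmetric PSF.
true_signal = np.zeros(21)
true_signal[10] = 1.0
psf = np.array([0.05, 0.25, 0.4, 0.25, 0.05])
observed = np.convolve(true_signal, psf, mode="same")

restored = richardson_lucy_1d(observed, psf)
# Restoration re-concentrates the star's flux: the restored profile is
# peakier than the observed one, whereas an edge filter would only
# modify the transition regions.
print(observed.max(), restored.max())
```

Running this shows the restored peak climbing back toward the original point source, which is the sense in which restoration "reconstructs the star intensity distribution" rather than just sharpening edges.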
@michael.a.covington
@michael.a.covington 18 days ago
58:49 This reminds me very much of wavelet sharpening of images of Mars. There is a lower limit on size, and oversharpening causes artifacts of that size.
@burninmind
@burninmind 25 days ago
Amazing!
@CedricThomas34
@CedricThomas34 a month ago
Wonderful presentation Adam!
@malcolqwe2
@malcolqwe2 a month ago
As per usual in my world, this doesn't work for me. After selecting Edit, the target keyword list is empty, and I know there are values to be listed. In my situation, I am trying to plate-solve a bunch of files in a mosaic, but the focal length is wrong in the FITS header. I thought that if I could edit the header to the right values, solving would work. Oh well, maybe there's ANOTHER way.
@AdamBlock
@AdamBlock a month ago
In your world... is it a FITS or an XISF file? It only works on FITS. Check the header. You know where to find me, so you can post screenshots.
@malcolqwe2
@malcolqwe2 a month ago
@@AdamBlock Of course - it's been processed six ways to Sunday - it's an XISF :( Thanks!
@matth6161
@matth6161 a month ago
Anyone else having issues with the script download? I put the address in PI and checked for updates. It downloaded but doesn't appear anywhere to open. The site says it will open under Scripts > Utilities, but it isn't there. Any ideas?
@ObesAU75
@ObesAU75 a month ago
This video not only shows you how but also explains why. It's amazing!
@AdamBlock
@AdamBlock a month ago
Thank you for the encouraging feedback.
@anata5127
@anata5127 a month ago
LOL!
@yervantparnagian5999
@yervantparnagian5999 a month ago
I know this is an older video, but why do you state flat frames could have an exposure of 30 or more seconds? It seems everyone is preaching that flat frames should be just 3-4 seconds tops. Has something changed since this video came out?
@AdamBlock
@AdamBlock a month ago
Nothing has changed. There is nothing wrong with longer exposures to achieve a particular light level. Narrowband filters sometimes require this- especially red ones since EL panels (for example) do not emit much red light. The point I was making is that the dark current is very low in exposures of even 30 seconds or more for many cameras. If this is the case, then you can just use a bias to calibrate for any time... even as "long" as 30 seconds.
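A quick sketch of the arithmetic behind this reply, with hypothetical NumPy stand-ins (this is not WBPP's code): when dark current is negligible, a light frame is just the sky signal modulated by the flat response plus the bias offset, so subtracting a master bias and dividing by the normalized flat is a complete calibration at any of these exposure times:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the master frames and the sky signal (in ADU).
master_bias = np.full((4, 4), 500.0)           # fixed camera offset
flat_response = rng.uniform(0.9, 1.1, (4, 4))  # pixel-to-pixel sensitivity
true_sky = np.full((4, 4), 1200.0)

# With negligible dark current, a light frame is sky * response + bias.
light = true_sky * flat_response + master_bias

# Bias-only calibration: subtract the bias, divide by the normalized flat.
calibrated = (light - master_bias) / (flat_response / flat_response.mean())

# The sky is recovered up to the flat's normalization constant.
print(np.allclose(calibrated, true_sky * flat_response.mean()))  # True
```

If the dark current were not negligible, a thermal term would remain after the bias subtraction, and a matching master dark would be needed instead; that is the whole trade-off being described.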
@yervantparnagian5999
@yervantparnagian5999 a month ago
Thank you Adam for the reply and all you do for the Astro community
@MarcelBlattner
@MarcelBlattner a month ago
Thanks. Great tutorial!
@FredLombardo
@FredLombardo a month ago
Where is this GAME script? I don’t have it
@AdamBlock
@AdamBlock a month ago
AdamBlockStudios.com
@FredLombardo
@FredLombardo a month ago
@@AdamBlock I tried this last night on Tsuchinshan-ATLAS data I took earlier in the week. I am going to have to review the video again, as when I set the mask over the comet and ran SXT, my results were a completely white image and a very distorted comet view, alone in the mask window. Not the results I was hoping for, lol. Thank you for what you do😊
@AdamBlock
@AdamBlock a month ago
@@FredLombardo The YT videos I make tend not to be entirely full explanations. That takes longer. Also, this is an old video; the techniques have improved. I created Comet Academy - the only dedicated comet-processing course (in PixInsight and others) in the world. If you want to know everything, get Comet Academy. www.adamblockstudios.com/categories/comet-academy/
@nicolassavard6860
@nicolassavard6860 a month ago
How do I add my flats/biases/darks in FastIntegration?
@AdamBlock
@AdamBlock a month ago
You don't. FastIntegration is a post-processing tool. You calibrate the data first before using it. FastIntegration is now available in WBPP... or you can use WBPP to calibrate first and then manually run FastIntegration.
@DrewJEvans44
@DrewJEvans44 a month ago
Excellent as usual, Adam.
@lesgatechair3907
@lesgatechair3907 a month ago
You can also blend two monochrome images. I did this to mix bin1 and bin2 masters
@rickbria8420
@rickbria8420 a month ago
Would you use this ImageBlend script for adding continuum-subtracted Ha data to RGB?
@AdamBlock
@AdamBlock a month ago
You could. However, the NB ColourMapper script can be used to manage the *color* of the blend. Usually you would blend with Screen, which is indeed one of the options in ImageBlend - but the color-management part is done in NB ColourMapper.
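For reference, the Screen blend mentioned here has a simple closed form on [0,1]-normalized data: screen(a, b) = 1 - (1 - a)(1 - b). This is standard image-compositing math, not ImageBlend's internals, and the pixel values below are made up:

```python
import numpy as np

def screen_blend(base, blend):
    """Screen blend on [0,1] data: invert both layers, multiply, invert
    back. The result is always at least as bright as either input, which
    is why it suits adding narrowband signal without darkening the RGB."""
    return 1.0 - (1.0 - base) * (1.0 - blend)

rgb = np.array([0.2, 0.5, 0.8])  # a made-up RGB pixel
nb = np.array([0.6, 0.1, 0.0])   # a made-up color-mapped narrowband pixel

out = screen_blend(rgb, nb)
print(out)  # close to [0.68, 0.55, 0.8]
```

Because Screen can only brighten, *what* color gets brightened is decided before the blend, which is the part NB ColourMapper manages.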
@rickbria8420
@rickbria8420 a month ago
@@AdamBlock Thanks Adam, just what I wanted to know.
@Maxastro59
@Maxastro59 a month ago
Sorry Adam, but I must have missed something. I saw in Fundamentals the whole process for developing M83. However, I was quite "shocked" by the lack of a calibration process. What happened to the famous dark-flat-bias frames? Maybe you explained it somewhere?
@AdamBlock
@AdamBlock a month ago
TelescopeLive provides only calibrated images. Other workflow examples show the calibration (WBPP processing). It is pretty much the same thing each time! :) I will be creating more content with more WBPP tutorials... the point of this section was the LRGB instruction (as well as more secret processing). Everyone always asks me to show what to do AFTER the initial processing... this is what I did.
@Z-add
@Z-add a month ago
Why don't you use a one-shot color image for reference only?
@stef2499
@stef2499 a month ago
Why did you not background neutralize this one?
@jackwmes
@jackwmes a month ago
One thing I find striking is the improvement in sharpness of the ImageBlend result compared to the LRGBCombination. It looks like your L image is sharper than the RGB, so is LRGBCombination not adequately retaining that level of detail in the result?? I thought that was the whole point of the tool!
@Neanderthal75
@Neanderthal75 a month ago
Adam, you are solving my biggest gripe so far regarding LRGB imaging. I avoided taking Ls because producing an LRGB image was a 50-50 chance of turning out good or bad. Eventually I found it to be a waste of time, because I buried the L with the curves and it did nothing to the image. I'm gonna try this method ASAP! Thanks again!
@AdamBlock
@AdamBlock a month ago
Great... I hope this helps. This is exactly what is done in Photoshop... so it should work just fine. All of the behaviors I learned over the years (previous to PI) now have come back into play. :)
@rvoykin
@rvoykin a month ago
I've just been using the Color and Luminosity blend modes in Photoshop, which has been really effective, but I'm curious about trying it this way.
@gclaytony
@gclaytony a month ago
Interesting new tool, Adam, and certainly one to try out. However, I'm a bit underwhelmed by the comparison between the LRGB color-channel combination method and the script. One thing the script rather clearly demonstrates is the need to ensure some form of DBE/gradient removal is done, and that the color image is color-corrected (as would be a normal process), before proceeding to the stretch and subsequent steps. If the component images have garbage, that seems to be enhanced in the combination process. The LRGB color combination image seemed to show an increase in black level and some reduction of the color bias/gradient relative to the script-created image (at least as was visible in the YouTube video).

How does the script tool work with starless images created with tools like SXT? I normally separate the stars out and work with them separately from the object (except when the object is a globular cluster or something similar). That allows the freedom to work with colors/saturation/contrast without impacting the star color or creating 'bloom'. How does it react when the Lum layer is also used as a mask to prevent changing/impacting the background during the combination process?

I like that the tool provides a means to preview the result, something the LRGB channel combination tool lacks. At first glance that seems to be the biggest single benefit, since the LRGB CC tool requires trial and error to get things 'just right'.
@MrStacaz99
@MrStacaz99 a month ago
What is going on here? Was something in PixInsight made EASIER to use?? I really don't understand how this can happen. It runs counter to the entire product philosophy behind PixInsight!! I may have to delete my installation of it!
@AdamBlock
@AdamBlock a month ago
Ha ha... I have the same issues... Look, I made a 9-minute video... I am surprised you watched it given its brevity. :)
@GhostSenshi
@GhostSenshi a month ago
Very nice. Thank you
@ACKitsBilltheCAT
@ACKitsBilltheCAT a month ago
Thanks for this free public teaser! I just watched the entire series in Fundamentals on your website this weekend, and got a lot out of it - highly recommended!
@AdamBlock
@AdamBlock a month ago
Great... Thank you for being a member!
@physmc1
@physmc1 a month ago
Amazing work. Looking forward to trying the script on my data.
@AdamBlock
@AdamBlock a month ago
Great... I think you'll like it.
@DSOImager
@DSOImager a month ago
That's cool. I bet this would work well for Ha, LRGB images.
@Phenolisothiocyanate
@Phenolisothiocyanate a month ago
Hi Adam, I'm very near the end of the Fundamentals path. There's a recurring question I've never had answered satisfactorily: if you've shot enough RGB that color noise in your target is well controlled, then why do you need L? Your SNR is, by definition, already high, so what's the benefit? I often see images where L is used to bring out faint details more quickly, and the overall structure looks fine. The color, however, is usually mottled or smeared, presumably from insufficient RGB. It appears that the color data gets heavily processed to control noise with tools like BlurXTerminator or with convolution. While this can look OK superficially, it doesn't look very good under even casual closer inspection. Is there something I'm missing? Thanks!
@AdamBlock
@AdamBlock a month ago
You aren't missing anything. You will note in my explanation for LRGB in my tutorial I talked about why LRGB was useful in the past (time savings through binning to reduce read-noise). However, this isn't true any longer with CMOS. So why did I make the video? Well... people are still doing LRGB! lol I do not think there is a big benefit other than some time savings where it isn't possible to acquire as much color as desired. That is really it I think.
@Phenolisothiocyanate
@Phenolisothiocyanate a month ago
@@AdamBlock Thanks! As an experiment I processed NGC 4565 with a synthetic L composed of the integrated RGB masters, and then with a true L. The difference was negligible. For bright targets I think I'll shoot pure RGB. For faint stuff like IFN I'll shoot LRGB, because some parts are just too faint to get SNR on without 40+ hours of combined RGB. Thanks again! BTW, your section on image calibration helped me solve an issue I was having with my ASI6200MM. 😌
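The synthetic-L experiment described above amounts to averaging the linear R, G and B masters into a single frame. One hypothetical weighting scheme, shown only for illustration (the function and values are made up, and this is not necessarily how PixInsight combines channels), is inverse-variance weighting so noisier masters contribute less:

```python
import numpy as np

def synthetic_luminance(channels, noise_sigmas):
    """Combine linear channel masters into a synthetic L, weighting each
    channel by inverse variance so noisier masters contribute less."""
    channels = np.asarray(channels, dtype=float)
    weights = 1.0 / np.asarray(noise_sigmas, dtype=float) ** 2
    weights = weights / weights.sum()
    # Contract the weight vector against the channel axis.
    return np.tensordot(weights, channels, axes=1)

r = np.full((2, 2), 0.30)  # made-up linear masters
g = np.full((2, 2), 0.40)
b = np.full((2, 2), 0.20)

# Equal noise in all channels reduces to a plain average of the masters.
L = synthetic_luminance([r, g, b], noise_sigmas=[1.0, 1.0, 1.0])
print(L[0, 0])  # ~0.3, the mean of 0.30, 0.40, 0.20
```

With roughly equal noise per channel, this is why the synthetic-L and true-L results in the experiment came out so close: both end up as photon-weighted averages of the same light.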
@danielshade710
@danielshade710 a month ago
I also don't see any benefit to using real L. Plus, I really don't care to stack 3x the images just to get control over certain portions of the image. There are a lot of great tools for contrast control.
@rafaberrios8142
@rafaberrios8142 a month ago
This is so awesome. Can I do this to blend luminance from mono with OSC data?
@AdamBlock
@AdamBlock a month ago
Yep. This is open-ended.
@JK-gn9qi
@JK-gn9qi a month ago
Nice! When in the process do you integrate them? At the start, or after first processing both the luminance and RGB?
@AdamBlock
@AdamBlock a month ago
Now... that is a question that is answered on my site in my tutorials (you should consider becoming a member!). I am not certain I understand what you mean by "integrate"... you do this after you have permanently stretched them... then you put them together and work on the image from that point. Is this what you mean?
@JK-gn9qi
@JK-gn9qi a month ago
@@AdamBlock Yes, that is what I meant, thank you very much!
@haiderbhogadia4829
@haiderbhogadia4829 a month ago
Hi Adam, with OSC do you extract a luminance image from the RGB image?
@AdamBlock
@AdamBlock a month ago
You could... but the luminance (lightness) is the same information as when you keep it all together as a single color image. There isn't a benefit to extracting this information unless you plan to take a wildly different path with it... but I cannot think of a reason right now that would be beneficial.
@DylanODonnell
@DylanODonnell a month ago
Thanks Adam!
@AdamBlock
@AdamBlock a month ago
Thanks! I did write to you many times concerning GalaCell... but the e-mails would not go through! Do you have any other means of contact? I did get yours through my site.
@Ekuy1
@Ekuy1 a month ago
Looks great! Is LRGBCombination (the native PI version) supposed to be a non-linear process? I've always used it in the linear stage with linear-fitted, calibrated and background-neutralized images, which would give me a very flat result that I'd manipulate further in processing. This is because I have quite a practiced and developed RGB processing workflow, and I feel more comfortable using the regular PI tools for my non-linear processes (curves, masked brightness adjustments, etc.). With your tool, would I still be able to do the L+RGB combination in a linear state, not a non-linear one as you showed in the video, so that I can stretch and process them together?
@AdamBlock
@AdamBlock a month ago
The way I demonstrated its usage is how it was designed to be used for this kind of LRGB result. There is a thread about this and an explanation by Juan Conejero to this effect.
@robertocoleschi4025
@robertocoleschi4025 a month ago
Adam has provided us, without any doubt, with the best tutorials on image processing worldwide for over twenty years (and many of them are free). The investment in your tutorials is the best expense an amateur can afford. Thanks a lot for your time and creativity.
@AdamBlock
@AdamBlock a month ago
Thank you!
@simonpepper5053
@simonpepper5053 a month ago
Thanks for being so amazing Adam!
@dirkseidel4531
@dirkseidel4531 a month ago
Perfect, this is what I'm looking for. LRGB was always difficult for me to combine; I tried different methods. Thank you, Adam, for your work. Greetings from Switzerland.
@AdamBlock
@AdamBlock a month ago
Thanks for watching!
@starsips
@starsips a month ago
Amazing, Adam! This script is most welcome! Thank you!
@AdamBlock
@AdamBlock a month ago
Thanks for watching!
@englishsubplzzaoki3445
@englishsubplzzaoki3445 a month ago
pulsar
@ffme33
@ffme33 a month ago
Oh wow, I can't remember what I did on this date, but you did something incredible. Thank you!
@AdamBlock
@AdamBlock a month ago
Thank you for watching!
@mikehardy8247
@mikehardy8247 a month ago
That fireball is a bonus. The ISS went through a couple of my frames, which I thought was pretty cool.
@AdamBlock
@AdamBlock a month ago
Excellent! When the fireball happened... my wife (who was behind me) exclaimed "aahhhh!!" So I turned to look at her..... I did capture it in the sequence though. lol
@steveduffy8726
@steveduffy8726 a month ago
Like all your images, this one turned out wonderfully well.
@Jekaniah-jm7gq
@Jekaniah-jm7gq a month ago
What a beautiful video. How exciting this must have been for you to see. Thanks for sharing.
@kerrygreen9064
@kerrygreen9064 a month ago
I will continue to state how unprepared I was for just how extensive this storm was at 37° N. I remember seeing an aurora as a child; it was a reddish sky with some rays toward the north. This is what I was expecting. I was completely taken aback by the display. There were curtains, pickets, blobs, corona aurora... The display extended toward the southern sky. Utterly amazing and emotionally evocative.