Let's take this one step further. Say I WBPPed a few nights and kept the calibrated files and all that. Then I add a couple more nights some time later. Is there a way to save the time of recalibrating the initial set of data and calibrating just the added data? Obviously measurements, LN (if applicable), registration, etc. must be done for the whole set.
@AdamBlock 3 days ago
Yes, do the calibration of the new data alone with WBPP. Then run postprocessing on the entire set of data with WBPP. If you want to get fancy... if you know your registration reference you can include this in the first step. Then you do the rest of the data post processing with WBPP on the complete set so LN and Integration will be done on the complete set.
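The two-pass idea above can be sketched in plain Python. This is only an illustrative sketch of the caching logic, not how WBPP is implemented (WBPP is a PixInsight JavaScript script); the `calibrate` function and file names are hypothetical placeholders.

```python
# Sketch: calibrate only frames that have no cached calibrated output,
# then run the shared post-processing steps (registration, LN,
# integration) on the COMPLETE calibrated set.

def calibrate(frame):
    # Placeholder for dark/flat/bias calibration of one light frame.
    return f"{frame}_c"

def process_sessions(all_frames, cache):
    """Calibrate new frames only; earlier sessions come from the cache."""
    for frame in all_frames:
        if frame not in cache:          # only the newly added night(s)
            cache[frame] = calibrate(frame)
    # Registration / LN / integration always see the complete set.
    return [cache[f] for f in all_frames]

cache = {}
night1 = ["n1_001.fit", "n1_002.fit"]
process_sessions(night1, cache)                 # first run: calibrates night 1

night2 = night1 + ["n2_001.fit"]
calibrated = process_sessions(night2, cache)    # second run: only n2_001 is new
print(calibrated)   # ['n1_001.fit_c', 'n1_002.fit_c', 'n2_001.fit_c']
```

The point of the cache is the same as keeping your calibrated files on disk: the second pass touches only the new night, while the final list fed to integration always covers every session.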
@rocketcityastro 5 days ago
I can't wait to play with this one. great job Adam!!
@addos999 9 days ago
you sure do come up with innovative ways to exploit pix's tools Adam, bravo!
@Microtonal_Cats 10 days ago
Is the Drizzle checkbox removed in this? I can't tell if we should just check "Rejection maps / drizzle files" in Calibration. It's not clear from the documentation whether that runs Drizzle or just prepares for it.
@AdamBlock 9 days ago
No drizzle ...
@richardsauerbrun2412 11 days ago
I am also having trouble adding the script to my repository. I have added the link, saved, and then ran the update. The update showed new items, which I applied and then restarted PI. But the script does not appear under Utilities. I have screen shots if they will help. I consider myself competent in this area as I have added numerous items to my PI repository.
@AdamBlock 10 days ago
Sure... send a screenshot of your repository. Use my forum on my site (I assume you are a member).
@davecurtis8833 11 days ago
Wonderful location and wonderful image
@AdamBlock 10 days ago
Thank you!
@ssbhide123 13 days ago
I don't have Pixinsight yet but I like these videos 😅 Btw how is this different than using defringing option in camera raw filter of Adobe PS? It does a surprisingly good job with just one click!
@AdamBlock 13 days ago
BXT is a restoration algorithm. This is how it is different...it is attempting to reconstruct the PSF (the star intensity distribution). PS Defringing is an edge modification... not a restoration.
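For the curious, the "restoration" idea can be illustrated with a toy 1-D Richardson-Lucy deconvolution, the classic PSF-based restoration algorithm. BXT's actual neural-network method is proprietary; this pure-Python sketch only shows what "reconstructing the intensity distribution" means, as opposed to merely editing edges.

```python
def convolve(signal, psf):
    # 'Same'-size 1-D convolution with a small symmetric PSF (zero-padded edges).
    half = len(psf) // 2
    out = []
    for i in range(len(signal)):
        s = 0.0
        for j, p in enumerate(psf):
            k = i + j - half
            if 0 <= k < len(signal):
                s += signal[k] * p
        out.append(s)
    return out

def richardson_lucy(observed, psf, iterations=30):
    # Classic multiplicative RL update; for a symmetric PSF the
    # correlation step equals another convolution with the same kernel.
    estimate = [1.0] * len(observed)
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / b if b > 1e-12 else 0.0 for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

psf = [0.25, 0.5, 0.25]              # symmetric blur kernel ("seeing")
star = [0, 0, 0, 10.0, 0, 0, 0]      # a point source
observed = convolve(star, psf)       # what the camera records
restored = richardson_lucy(observed, psf)
print(max(observed), max(restored))  # the restored peak is far more concentrated
```

A defringe-style edge edit would only modify pixels near the gradient; restoration uses a PSF model to push flux back toward where it came from.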
@michael.a.covington 18 days ago
58:49 This reminds me very much of wavelet sharpening of images of Mars. There is a lower limit on size, and oversharpening causes artifacts of that size.
@burninmind 25 days ago
Amazing!
@CedricThomas34 a month ago
Wonderful presentation Adam!
@malcolqwe2 a month ago
As per usual in my world, this doesn't work for me. After selecting Edit, the target keyword list is empty, and I know there are values to be listed. In my situation, I am trying to plate solve a bunch of files in a mosaic, but the focal length is wrong in the FITS header. I thought if I could edit the header to the right values, solving would work. Oh well... maybe there's ANOTHER way.
@AdamBlock a month ago
In your world... is it a FITS or an XISF file? It only works on FITS. Check the header. You know where to find me, so you can post screenshots.
@malcolqwe2 a month ago
@@AdamBlock Of course - it's been processed six ways to Sunday - it's an XISF :( Thanks!
@matth6161 a month ago
Anyone else having issues with the script download? I put the address in PI and checked for updates. It downloaded but doesn't appear anywhere to open. The site says it will open under Scripts > Utilities, but it isn't there. Any ideas?
@ObesAU75 a month ago
This video not only shows you how but also explains why. It's amazing!
@AdamBlock a month ago
Thank you for the encouraging feedback.
@anata5127 a month ago
LOL!
@yervantparnagian5999 a month ago
I know this is an older video, but why do you state flat frames could have an exposure of 30 or more seconds? It seems everyone is preaching that flat frames should be just 3-4 seconds tops. Has something changed since this video came out?
@AdamBlock a month ago
Nothing has changed. There is nothing wrong with longer exposures to achieve a particular light level. Narrowband filters sometimes require this- especially red ones since EL panels (for example) do not emit much red light. The point I was making is that the dark current is very low in exposures of even 30 seconds or more for many cameras. If this is the case, then you can just use a bias to calibrate for any time... even as "long" as 30 seconds.
@yervantparnagian5999 a month ago
Thank you Adam for the reply and all you do for the Astro community
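Adam's point reduces to the standard calibration arithmetic: when dark current is negligible, a master bias can stand in for a dark of any exposure length. Here is a minimal sketch with made-up ADU values for a 4-pixel "image" (normalizing the flat by its mean is one common convention; WBPP handles the scaling internally).

```python
# Bias-only calibration:
#   calibrated = (light - bias) / flat_norm
#   flat_norm  = (flat - bias) / mean(flat - bias)
# All numbers below are illustrative ADU values, not real data.

light = [1100.0, 1080.0, 1120.0, 1100.0]
bias  = [100.0, 100.0, 100.0, 100.0]
flat  = [600.0, 580.0, 620.0, 600.0]   # vignetting pattern + bias level

flat_sub  = [f - b for f, b in zip(flat, bias)]
mean_flat = sum(flat_sub) / len(flat_sub)        # 500.0
flat_norm = [f / mean_flat for f in flat_sub]    # [1.0, 0.96, 1.04, 1.0]

calibrated = [(l - b) / fn for l, b, fn in zip(light, bias, flat_norm)]
print(calibrated)
```

The dimmer pixel (divided by 0.96) is boosted and the brighter one (divided by 1.04) is suppressed, flattening the field; the bias subtraction is identical whether the "dark" signal comes from a 3-second or a 30-second exposure, provided dark current is truly negligible.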
@MarcelBlattner a month ago
Thanks. Great tutorial!
@FredLombardo a month ago
Where is this GAME script? I don’t have it
@AdamBlock a month ago
AdamBlockStudios.com
@FredLombardo a month ago
@@AdamBlock I tried this last night on Tsuchinshan-ATLAS data I took earlier in the week. I am going to have to review the video again: when I set the mask over the comet and ran STX, my results were a completely white image and a very distorted comet view, alone in the mask window. Not the results I was hoping for, lol. Thank you for what you do😊
@AdamBlock a month ago
@@FredLombardo The YT videos I make tend not to be entirely full explanations. That takes longer. Also, this is an old video; the techniques have improved. I created Comet Academy - the only dedicated comet processing course (in PixInsight and others) in the world. If you want to know everything, get Comet Academy. www.adamblockstudios.com/categories/comet-academy/
@nicolassavard6860 a month ago
How do I add my flats/bias/and darks in fast integration?
@AdamBlock a month ago
You don't. FastIntegration is a post processing tool. You calibrate the data first before using it. FastIntegration is now available in WBPP... or you can use WBPP to calibrate first and then manually run FastIntegration.
@DrewJEvans44 a month ago
Excellent as usual, Adam.
@lesgatechair3907 a month ago
You can also blend two monochrome images. I did this to mix bin1 and bin2 masters
@rickbria8420 a month ago
Would you use this ImageBlend script for adding continuum-subtracted Ha data to RGB?
@AdamBlock a month ago
You could. However the NB ColourMapper script can be used to manage the *color* of the blend. Usually you would blend with Screen which is indeed one of the options in ImageBlend- but the color management part is done in NB ColourMapper.
@rickbria8420 a month ago
@@AdamBlock Thanks Adam, just what I wanted to know.
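The Screen blend Adam mentions above is the standard formula `result = 1 - (1 - a)(1 - b)` on normalized intensities. A tiny sketch (the pixel values are illustrative):

```python
def screen(a, b):
    """Screen blend of two normalized (0-1) intensities: 1 - (1-a)(1-b).
    Screen only ever brightens, which is why it suits adding Ha signal."""
    return 1.0 - (1.0 - a) * (1.0 - b)

red_channel = 0.30   # stretched red pixel from the RGB image
ha_signal   = 0.50   # continuum-subtracted Ha at the same pixel
print(screen(red_channel, ha_signal))   # 0.65
```

Because the result can never be darker than either input, Screen adds emission without crushing the underlying broadband signal; the hue of the addition is what NBColourMapper then manages.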
@Maxastro59 a month ago
Sorry Adam, but I must have missed something. I saw in Fundamentals the whole process for developing M83. However, I was quite "shocked" by the lack of a calibration process. What happened to the famous dark-flat-bias frames? Maybe you explained it somewhere?
@AdamBlock a month ago
TelescopeLive provides only calibrated images. Other workflow examples show the calibration (WBPP processing). It is pretty much the same thing each time! :) I will be creating more content with more WBPP tutorials... the point of this section was the LRGB instruction (as well as more secret processing). Everyone always asks me to show what to do AFTER the initial processing... this is what I did.
@Z-add a month ago
Why don't you use a one-shot color image for reference only?
@stef2499 a month ago
Why did you not background neutralize this one?
@jackwmes a month ago
One thing I find striking is the improvement in sharpness of the ImageBlend result compared to the LRGBCombination. It looks like your L image is sharper than the RGB, so is LRGBCombination not adequately retaining that level of detail in the result?? I thought that was the whole point of the tool!
@Neanderthal75 a month ago
Adam, you are solving my biggest gripe so far regarding LRGB imaging. I avoided taking Ls because producing an LRGB image was a 50-50 chance of turning out good or bad. Eventually I found it to be a waste of time, because I buried the L with the curves and it did nothing to the image. I'm gonna try this method ASAP! Thanks again!
@AdamBlock a month ago
Great... I hope this helps. This is exactly what is done in Photoshop... so it should work just fine. All of the behaviors I learned over the years (previous to PI) now have come back into play. :)
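Conceptually, the Photoshop-style "luminosity" blend being discussed replaces the lightness of each RGB pixel while preserving its color. A toy sketch that does this by rescaling the channels (the Rec.709 luma weights are my assumption for illustration only; PixInsight's LRGBCombination and ImageBlend actually work in CIE color spaces):

```python
# Replace the luminance of an RGB pixel with a processed L value by
# scaling all three channels by the same gain, which preserves the
# channel ratios (i.e., the hue). Illustrative sketch, not PI's algorithm.

LUMA = (0.2126, 0.7152, 0.0722)   # Rec.709 luma weights (assumed here)

def replace_luminance(rgb, new_l):
    old_l = sum(w * c for w, c in zip(LUMA, rgb))
    gain = new_l / old_l
    return tuple(min(1.0, c * gain) for c in rgb)

pixel   = (0.20, 0.10, 0.30)   # color from the RGB stack
sharp_l = 0.40                 # brighter, sharper L at this pixel
print(replace_luminance(pixel, sharp_l))
```

As long as no channel clips, the result carries exactly the new luminance while the R:G:B ratios, and therefore the color, are unchanged; that is why a good L can sharpen an LRGB image without shifting its hues.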
@rvoykin a month ago
I’ve just been using the color and luminous blend mode in Photoshop, which has been really effective, but curious about trying it this way
@gclaytony a month ago
Interesting new tool, Adam, and certainly one to try out. However, I'm a bit underwhelmed by the comparison between the LRGB color channel combination method and the script. One thing the script rather clearly demonstrates is the need to ensure some form of DBE/gradient removal is done, and that the color image is color corrected (as would be normal), before proceeding to the stretch and subsequent steps. If the component images have garbage, that seems to be enhanced in the combination process. The LRGB color combination image seemed to show an increase in black level and some reduction of the color bias/gradient relative to the script-created image (at least as was visible in the YouTube video).

How does the script tool work with starless images created with tools like SXT? I normally separate the stars out and work with them separately from the object (except when the object is a globular cluster or something similar). That allows the freedom to work with colors/saturation/contrast without impacting the star color or creating 'bloom'. How does it react when the Lum layer is also used as a mask to prevent changing/impacting the background during the combination process?

I like that the tool provides a means to preview the result, something the LRGB channel combination tool lacks. At first glance that seems to be the biggest single benefit, since the LRGB CC tool requires trial and error to get things 'just right'.
@MrStacaz99 a month ago
What is going on here? Was something in PixInsight made EASIER to use?? I really don't understand how this can happen. It runs counter to the entire product philosophy behind PixInsight!! I may have to delete my installation of it!
@AdamBlock a month ago
Ha ha... I have the same issues... look, I made a 9-minute video. I am surprised you watched it given its brevity. :)
@GhostSenshi a month ago
Very nice. Thank you
@ACKitsBilltheCAT a month ago
Thanks for this free public teaser! I just watched the entire series in Fundamentals on your website this weekend, and got a lot out of it - highly recommended!
@AdamBlock a month ago
Great... Thank you for being a member!
@physmc1 a month ago
Amazing work. Looking forward to trying the script on my data.
@AdamBlock a month ago
Great... I think you'll like it.
@DSOImager a month ago
That's cool. I bet this would work well for Ha, LRGB images.
@Phenolisothiocyanate a month ago
Hi Adam, I'm very near the end of the Fundamentals path. There's a recurring question that I've never had answered satisfactorily: If you've shot enough RGB that color noise in your target is well controlled then why do you need L? Your SNR is, by definition, already high so what's the benefit? I often see images where L is used to bring out faint details more quickly and the overall structure looks fine. The color, however, is usually mottled or smeared, presumably from insufficient RGB. It appears that the color data gets heavily processed to control noise with tools like Blurxterminator or with convolution. While this can look ok superficially it doesn't look very good under even casual closer inspection. Is there something I'm missing? Thanks!
@AdamBlock a month ago
You aren't missing anything. You will note in my explanation for LRGB in my tutorial I talked about why LRGB was useful in the past (time savings through binning to reduce read-noise). However, this isn't true any longer with CMOS. So why did I make the video? Well... people are still doing LRGB! lol I do not think there is a big benefit other than some time savings where it isn't possible to acquire as much color as desired. That is really it I think.
@Phenolisothiocyanate a month ago
@@AdamBlock Thanks! As an experiment I processed NGC4565 with a synthetic L composed of integrated RGB masters and then with a true L. The difference was negligible. For bright targets I think I'll shoot pure RGB. For faint stuff like IFN I'll shoot LRGB because some parts are just too faint to get SNR on without 40+ hours of combined RGB. Thanks again! Btw, your section on image calibration helped me solve an issue I was having with my ASI6200MM. 😌
@danielshade710 a month ago
I also don’t see any benefit to using real L. Plus I really don’t care to stack 3X the images just to access control of certain portions of the image. There’s a lot of great tools for contrast control
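The trade-off this thread debates comes down to statistics: averaging N equally noisy frames cuts noise by about sqrt(N), which is why a synthetic L built from the RGB masters behaves much like a real L of similar total exposure. A quick seeded simulation (Gaussian noise and equal channel weights are idealizations):

```python
import random
import statistics

random.seed(42)
TRUE_SIGNAL = 100.0   # arbitrary "sky + target" level
SIGMA = 10.0          # per-frame noise
N_PIX = 20000

def noisy_frame():
    return [random.gauss(TRUE_SIGNAL, SIGMA) for _ in range(N_PIX)]

# Three equally deep "masters" standing in for R, G, B.
r, g, b = noisy_frame(), noisy_frame(), noisy_frame()
synthetic_l = [(x + y + z) / 3 for x, y, z in zip(r, g, b)]

noise_single = statistics.stdev(r)            # about 10
noise_synth = statistics.stdev(synthetic_l)   # about 10 / sqrt(3) = 5.8
print(round(noise_single, 2), round(noise_synth, 2))
```

This matches the experiment described above: a synthetic L from three RGB masters already has roughly 1.7x lower noise than any single channel, so a real L of the same length adds little for bright targets.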
@rafaberrios8142 a month ago
This is so awesome. Can I do this by blending luminance from mono with OSC data?
@AdamBlock a month ago
Yep. This is open-ended.
@JK-gn9qi a month ago
Nice! When in the process do you integrate them? At the start or after first processing both the luminance and rgb?
@AdamBlock a month ago
Now... that is a question that is answered on my site in my tutorials (you should consider becoming a member!). I am not certain I understand what you mean by "integrate"... you do this after you have permanently stretched them... then you put them together and work on the image from that point. Is this what you mean?
@JK-gn9qi a month ago
@@AdamBlock Yes, that is what I meant, thank you very much!
@haiderbhogadia4829 a month ago
Hi Adam, with OSC do you extract a luminance image from the RGB image?
@AdamBlock a month ago
You could... but the luminance (lightness) is the same information as when you keep it all together as a single color image. There isn't a benefit to extracting this information unless you plan to take a wildly different path with it... but I cannot think of a reason right now that would be beneficial.
@DylanODonnell a month ago
Thanks Adam!
@AdamBlock a month ago
Thanks! I did write to you many times concerning GalaCell... but the e-mails would not go through! Do you have any other means of contact? I did get yours through my site.
@Ekuy1 a month ago
Looks great! Is LRGBCombination (the native PI version) supposed to be a non-linear process? I've always used it in the linear stage with linear-fitted, calibrated and background-neutralized images, which would give me a very flat result that I'd manipulate further in processing. This is because I have quite a practiced and developed RGB processing workflow, and I feel more comfortable using the regular PI tools for my non-linear processes (curves, masked brightness adjustments, etc.). With your tool, would I still be able to do the L+RGB combination in a linear state, rather than the non-linear one you showed in the video, so that I can stretch and process them together?
@AdamBlock a month ago
The way I demonstrated its usage is how it was designed to be used for this kind of LRGB result. There is a thread about this and an explanation by Juan Conejero to this effect.
@robertocoleschi4025 a month ago
Adam provides us, without any doubt, the best tutorials on image processing worldwide, and has for over twenty years (and many of them are free). The investment in your tutorials is the best expense an amateur can afford. Thanks a lot for your time and creativity.
@AdamBlock a month ago
Thank you!
@simonpepper5053 a month ago
Thanks for being so amazing Adam!
@dirkseidel4531 a month ago
Perfect, this is what I'm looking for. LRGB was always difficult for me to combine; I tried different methods. Thank you, Adam, for your work. Greetings from Switzerland!
@AdamBlock a month ago
Thanks for watching!
@starsips a month ago
Amazing Adam! This script is most welcome! Thank you
@AdamBlock a month ago
Thanks for watching!
@englishsubplzzaoki3445 a month ago
pulsar
@ffme33 a month ago
Oh wow I can't remember what I did on this date but you did something incredible, thank you
@AdamBlock a month ago
Thank you for watching!
@mikehardy8247 a month ago
That fireball is a bonus. The ISS went through a couple of my frames, which I thought was pretty kool.
@AdamBlock a month ago
Excellent! When the fireball happened... my wife (who was behind me) exclaimed "aahhhh!!" So I turned to look at her..... I did capture it in the sequence though. lol
@steveduffy8726 a month ago
Like all your images, this one turned out wonderfully well.
@Jekaniah-jm7gq a month ago
What a beautiful video. How exciting this must have been for you to see. Thanks for sharing.
@kerrygreen9064 a month ago
I will continue to state how unprepared I was for just how extensive this storm was at 37° N. I remember seeing aurora as a child, and it was a reddish sky with some rays toward the north. This is what I was expecting. I was completely taken aback by the display. There were curtains, pickets, blobs, corona aurora… The display extended toward the southern sky. Utterly amazing and emotionally evocative.
I will continue to state how unprepared I was for just how extensive this storm was at 37° N. I remember seeing aurora as a child and it was a reddish sky with some rays toward the north. This is what I was expecting. I was completely taken aback by the display. There were curtains, pickets, blobs, corona aurora… The display extend toward the Southern Sky. Utterly amazing and emotionally evocative.