You can definitely build some cool features in your app by experimenting with image filters.
@aporvarathi8007 6 years ago
EAGLContext is deprecated in iOS 12! What can we use instead of it?
@DSOnur 4 years ago
Great course and teaching skills.
@vivanshinoda9793 4 years ago
Great video, appreciate your tutorial for learning Core Image (y)
@NiteshKumar-ty2rm 5 years ago
How do you pick the most dominant colour from an image?
@newmodeav 6 years ago
Is there exposure blending in the Core Image filter list?
@ayon3527 6 years ago
Very nice tutorial
@CodePro 6 years ago
😀
@ashim44 6 years ago
How do you generate a Data Matrix barcode using CIFilter?
@TheWhistlinWarthog 6 years ago
Hey man, really useful video, thanks so much. I'm still struggling with implementing Core Image filters that use CIVectors as inputs, and also ones with multiple input parameters. Specifically, I'm trying to figure out how to apply CITemperatureAndTint to an image. Any help would be hugely appreciated :)
@stevengao8345 6 years ago
The title may not seem as attractive as the content, but this tutorial is actually so cool. It's the first time I've seen image filtering handled on the iPhone GPU, and I'm very glad to have learned something new today. (The 30-minute duration is kind of scary, but once you start watching, the tutorial is so easy and clear that time flies by.) I do have a few questions on this topic, though. When should we apply the filter? Can we apply it inside the camera, right after pressing the take-photo button, or should we apply it once we select the photo we took via UIImagePickerController? Is this the method we need to move the processing to the background: DispatchQueue.global(qos: .background).async { ... }? And if a user taps the Back button to exit the view controller while we are in the middle of processing an image, will the iPhone GPU automatically stop processing and return the memory, or could that easily crash the app? Thanks.
@CodePro 6 years ago
You could apply image filtering at almost any point; it really depends on your app architecture. You could take a picture using the camera, extract the image, and then filter it, or present the raw image to the user and let them apply filters. The same logic applies when selecting images from the Photos app. As for background threading, handling the filtering on a background thread is probably a good idea, since you don't want to spike the CPU/GPU on the main thread and block UI elements, especially if you have multiple images to process.
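This background-threading advice can be sketched roughly as follows. This is a minimal illustration, not code from the video: the function name, the `imageView` parameter, and the choice of CISepiaTone as an example filter are all assumptions.

```swift
import UIKit
import CoreImage

// Sketch: run the filter off the main thread, then hop back to main for UI.
// `sourceImage` and `imageView` are assumed to exist in your view controller.
func applySepia(to sourceImage: UIImage, updating imageView: UIImageView) {
    DispatchQueue.global(qos: .userInitiated).async {
        // CIContext creation is expensive; in a real app, create it once and reuse it.
        let context = CIContext()
        guard let input = CIImage(image: sourceImage),
              let filter = CIFilter(name: "CISepiaTone") else { return }
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(0.8, forKey: kCIInputIntensityKey)
        guard let output = filter.outputImage,
              let cgImage = context.createCGImage(output, from: output.extent) else { return }
        DispatchQueue.main.async {
            // UI updates must stay on the main thread.
            imageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```

`.userInitiated` is used here rather than `.background` because the user is typically waiting on the result; either QoS keeps the work off the main thread.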
@stevengao8345 6 years ago
I've watched this video 2.5 times. In your struct there are three values, and for the third one, filterEffectValueName, why is the string you entered kCIInputRadiusKey? Don't we need an NSNumber to specify how large the radius should be, or does the key string set a default value for the radius? I'm a little confused here. Thanks.
@stevengao8345 6 years ago
For instance, I tried to use another pretty cool filter, CIDepthOfField, and the applyFilter method did not change the image. Maybe there is a key element I missed? Thanks.
@CodePro 6 years ago
The third parameter is a string for the name of the key. The actual filterEffectValue was defined as type 'Any' to represent whatever value needs to be provided whether it be a Float, Double, Int, etc.
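The key/value pairing being described maps onto CIFilter's key-value coding API. A minimal sketch of the struct and apply function under discussion, with field names inferred from the comments (the video's exact code may differ):

```swift
import CoreImage

// The third struct member is the *name* of the key (a String such as
// kCIInputRadiusKey); the second member holds the *value* for that key
// (e.g. a Double for a radius), hence its type is Any.
struct FilterEffect {
    let filterName: String
    let filterEffectValue: Any
    let filterEffectValueName: String
}

func apply(_ effect: FilterEffect, to image: CIImage) -> CIImage? {
    guard let filter = CIFilter(name: effect.filterName) else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    // Key and value travel together: the key names the parameter,
    // the value sets its magnitude.
    filter.setValue(effect.filterEffectValue, forKey: effect.filterEffectValueName)
    return filter.outputImage
}
```

So the answer to the question above: the radius value itself is passed as `filterEffectValue` (Core Image bridges Swift numbers to NSNumber), while `kCIInputRadiusKey` merely identifies which parameter is being set.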
@CodePro 6 years ago
CIDepthOfField is a bit more involved and takes a few more parameters. My guess is you need to supply a combination of inputPoint0 and inputPoint1 in addition to inputRadius. If inputRadius by itself appears to have no effect, try modifying the code to allow additional parameters, or just keep adding parameters to the effect until you start to see a noticeable change in the image.
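Setting those extra parameters might look like the sketch below. CIDepthOfField's inputPoint0 and inputPoint1 are CIVectors defining the line that stays in focus; the specific coordinate and radius values here are illustrative guesses, not values from the video.

```swift
import CoreImage

// Sketch: CIDepthOfField needs more than inputRadius to produce a visible
// effect. The two CIVector points define the in-focus region.
func depthOfField(_ input: CIImage) -> CIImage? {
    guard let filter = CIFilter(name: "CIDepthOfField") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    // A horizontal focus line across the middle of the image (illustrative).
    filter.setValue(CIVector(x: 0, y: input.extent.height * 0.5),
                    forKey: "inputPoint0")
    filter.setValue(CIVector(x: input.extent.width, y: input.extent.height * 0.5),
                    forKey: "inputPoint1")
    filter.setValue(10.0, forKey: kCIInputRadiusKey)      // blur outside the focus line
    filter.setValue(1.5, forKey: kCIInputSaturationKey)   // boost the in-focus area
    return filter.outputImage
}
```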
@KakhiKiknadze 6 years ago
It says EAGLContext is deprecated in iOS 12 :(
@aporvarathi8007 6 years ago
How did you do that?
@bdada2759 4 years ago
You didn't start from the beginning, so I don't get it.