I dove into the world of ML using scikit-learn and now I am learning TensorFlow. I searched a lot about the deployment of models, but I am having a hard time understanding the whole mechanism. I really appreciate your effort, this is the best content on ML deployment on YouTube 👍🏻
@radosccsi · 7 years ago
I made a model in Keras. Installed Keras and TensorFlow on an AWS instance in a virtualenv, created a single Python process listening to RabbitMQ with Pika, and used Flask over WSGI to put messages on the queue. The HTML client uploads a photo and is returned an ID, then it requests that ID's info from the server at one-second intervals. Works fine and the queuing is kind of bulletproof since it's running on a small CPU instance :)
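The enqueue-then-poll flow described in that comment (upload returns an ID immediately; the client polls for the result) can be sketched with the standard library alone. This is a toy illustration: queue.Queue stands in for RabbitMQ/Pika, plain functions stand in for the Flask routes, and all names are made up.

```python
import queue
import threading
import uuid

jobs = queue.Queue()   # stand-in for the RabbitMQ queue
results = {}           # stand-in for a results store

def submit(photo_bytes):
    """Upload handler: enqueue the job and return an ID right away."""
    job_id = str(uuid.uuid4())
    results[job_id] = None          # not ready yet
    jobs.put((job_id, photo_bytes))
    return job_id

def worker():
    """The single consumer process: pull a job, run the model, store the result."""
    while True:
        job_id, payload = jobs.get()
        # a real worker would call model.predict(payload) here
        results[job_id] = f"prediction for {len(payload)} bytes"
        jobs.task_done()

def poll(job_id):
    """Status endpoint the client hits at one-second intervals."""
    return results.get(job_id)      # None until the worker finishes

threading.Thread(target=worker, daemon=True).start()
```

The point of the design is that the web process never blocks on inference; the CPU-bound work drains through the queue at whatever rate the worker can sustain.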
@nourhacker3734 · 7 years ago
Hey rad, sounds very interesting. Where do I learn how to do this?
@altairpearl · 7 years ago
rad, RabbitMQ. I have heard about it and thought of using it.
@SirajRaval · 7 years ago
very cool
@shreyanshvalentino · 7 years ago
That's awesome!
@arjunsinghyadav4273 · 7 years ago
Hey Siraj, firstly, great video. Request: a tutorial on how to build a deployed deep learning model that learns from live data and updates itself to a new version.
@2500204 · 5 years ago
Just load the model and do model.fit(new_data), then overwrite the file using model.save() or whatever save function you are using. Incremental learning is the best solution for continuously updating models with new data.
@bharatsahu1599 · 4 years ago
@Shashwat don't you think it will take a lot of time to retrain with the new data included? The user won't wait till infinity for results.
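The reply above describes a load → fit(new data) → save loop. Here is a minimal sketch of that loop using scikit-learn's SGDClassifier.partial_fit as a stand-in for the Keras load_model/model.fit/model.save sequence (the file path and data are invented for the example):

```python
import os
import pickle
import tempfile

import numpy as np
from sklearn.linear_model import SGDClassifier

MODEL_PATH = os.path.join(tempfile.gettempdir(), "incremental_model.pkl")

rng = np.random.default_rng(0)
X_old = rng.normal(size=(100, 4))
y_old = (X_old[:, 0] > 0).astype(int)

# Initial training, then persist to disk.
clf = SGDClassifier(random_state=0)
clf.partial_fit(X_old, y_old, classes=np.array([0, 1]))
with open(MODEL_PATH, "wb") as f:
    pickle.dump(clf, f)

# Later, new data arrives: load, update incrementally, overwrite the file.
X_new = rng.normal(size=(20, 4))
y_new = (X_new[:, 0] > 0).astype(int)
with open(MODEL_PATH, "rb") as f:
    clf = pickle.load(f)
clf.partial_fit(X_new, y_new)   # updates the weights without a full retrain
with open(MODEL_PATH, "wb") as f:
    pickle.dump(clf, f)
```

The Keras version has the same shape: keras.models.load_model(path), model.fit(X_new, y_new), model.save(path). Doing the update offline in a batch job, then swapping the saved file in, is what avoids the "user waiting till infinity" problem raised in the reply.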
@q-leveldesign5342 · 7 years ago
Thank you, I have been wondering what to do with a model once trained. No one seems to be talking about this and it seems like a very important step. And yes, I have been searching furiously to figure it out. Thanks again.
@SirajRaval · 7 years ago
np
@vijayabhaskar-j · 7 years ago
I always wondered "Ok, I created a model, now what?". Thanks, Siraj!
@michaelbell6055 · 6 years ago
Siraj... my dude, yours are the shoulders I am standing on in my job. Thank you so much for all the incredible tutorials and additional resources!!!
@jijojohn5168 · 7 years ago
Long story short, Siraj earned around 864.84 dollars this month lol, go to 35:40. He deserves a lot more. Keep up the good work.
@stephk8316 · 7 years ago
jijo john not bad for a side job, and well deserved!
@tamgaming9861 · 7 years ago
He deserves a lot more - i wish him the best!
@SirajRaval · 7 years ago
ha! that slipped through. cool. i'll keep it there. transparency ftw
@chicken6180 · 7 years ago
i mean, does he not deserve it?
@theempire00 · 7 years ago
Damn, imagine what those youtubers with millions of followers earn...
@angelomenezes6044 · 7 years ago
Man, you are really underrated! You deserve a lot for these great videos about ML. A big thanks from Brazil for the awesome work!!!
@adamyatripathi2743 · 7 years ago
His notebook is Untitled... He chose the dark path....
@SirajRaval · 7 years ago
renamed it to demo now, so much more content coming
@adamyatripathi2743 · 7 years ago
Siraj Raval Your videos are good! May the force be with you...
@breakdancerQ · 5 years ago
@@adamyatripathi2743 Naming notebooks is for noobs
@KelvinMeeks · 6 years ago
Siraj, excellent tutorial - thanks for creating this.
@theempire00 · 7 years ago
24:18 When I run the command:
docker build --pull -t $USER/tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .
I get an error: invalid argument "/tensorflow-serving-devel" for -t: invalid reference format. Help? (On Windows 7, Docker Toolbox; $USER is empty there, so the tag starts with a slash.)
UPDATE: The following does work:
docker build --pull -t tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .
@yassineelb8735 · 6 years ago
just omit $USER/
@mercolani1 · 6 years ago
Loved the video, love the energy, he clearly has a deep understanding
@andresvourakis6880 · 7 years ago
Your explanation was on point!! Thank you Siraj
@SirajRaval · 7 years ago
np
7 years ago
Love your teaching :) Keep it up☺
@SirajRaval · 7 years ago
thx
@gattra · 5 years ago
Please rehearse more and these would be 10000% better
@svin30535 · 7 years ago
Great topic! Thanks Siraj.
@SirajRaval · 7 years ago
np
@abhiwins123 · 7 years ago
Thanks for the end-to-end TensorFlow tutorial. The world wows you for the AI revolution
@SirajRaval · 7 years ago
awesome thx
@aug_st · 7 years ago
Very useful. Thanks Siraj!
@SirajRaval · 7 years ago
np
@AbhishekKrSingh-ls5xu · 7 years ago
Hey Siraj, firstly, great video. Request: can you post a tutorial on TensorFlow distributed training on GPUs and Kubernetes?
@Hustada · 7 years ago
Thanks for sharing this. I've been wondering how to do this.
@genricandothers · 7 years ago
I barely ever comment on videos but I have got to show love for all I've learned on your channel. I've been recommending you to everyone I can find. What software do you use to do the screen background with you in the foreground by the way? I want to start a channel teaching atmospheric science and I like this style...
@Oneillphotographyithaca1 · 6 years ago
So cool! This is inspiring me to make some models. :)
@ProfessionalTycoons · 5 years ago
very good video
@harshmunshi6362 · 7 years ago
I guess you have shared enough knowledge for someone to start a company :/
@SirajRaval · 7 years ago
yup
@ttwan690 · 6 years ago
May the force be with you
@afshananwarali9462 · 6 years ago
Thanks for this. It works for me.
@xtr33me · 7 years ago
Thanks so much for this vid! Could you by chance in the future do the same thing, but for something custom, like a TensorFlow model that simply adds two floats and returns the response? Reason I ask is that I have been having a big problem figuring out how to set up a custom model for serving with regard to configuring the proto files and client.
@JabaBanik · 7 years ago
This is amazing, thanks Siraj. Since we are talking about production level, can you please suggest the server configuration required for TensorFlow Serving?
@captainwalter · 4 years ago
I honestly don't get how to employ the model. At what stage do we use the neural net to make decisions about actionable data, in this case see it decode the words?
@larryteslaspacexboringlawr739 · 7 years ago
thank you for tensorflow video
@SirajRaval · 7 years ago
np
@AlienService · 7 years ago
Thank you for these. I've learned a lot already. The big question and use case I'm interested in is using ML in Blender. The goal would be to create a Blender add-on that could be trained on and manipulate the mesh of a character model. With Blender and its add-ons all written in Python, this seems doable. The mesh data can be accessed within the Blender Python API pretty easily. My question is how to best set up a system that would take a character mesh (thousands of vertex coordinates), train a model on meshes each with a 'happy' shape key, and then be able to make a shape key on a new character mesh that also produces a happy expression.
@igorpoletaev8188 · 7 years ago
I was very surprised by the fact that bazel has been building my custom client for serving for a very long time... Does it need to compile so many sources every time I change the client code?
@thoughtsmithinnovation5432 · 7 years ago
Hi Siraj, you mentioned at 28:00 that Inception has hundreds of layers. If I am not wrong, it presently has only 48 layers. Please correct me if I am wrong, or if you are referring to something else.
@wasimnadaf11 · 5 years ago
super informative:)
@eliassocrates338 · 7 years ago
Siraj, could you please upload the weights of the models you trained as well, as neither online nor personalized training of models is a financially viable option.
@CKSLAFE · 6 years ago
So sad this tutorial is broken now, they changed the GitHub repository. Now you don't have the tensorflow folder inside serving. If anybody knows of a tutorial please let me know.
@afshananwarali9462 · 6 years ago
There is no tensorflow folder inside of serving on GitHub. What should I do?
@lakrounisanaa9156 · 6 years ago
Hi, what do you do in this case? I face the same issue
@iulia2190 · 6 years ago
try to build in the serving directory
@jenlee6693 · 6 years ago
You can do 'bazel build -c opt tensorflow_serving/...' at tensorflow-serving directory in docker container.
@FZ8Yamaha · 6 years ago
According to github.com/tensorflow/serving/issues/755 , looks like we can just skip the cd tensorflow and ./configure steps
@kevinwong322 · 6 years ago
such a helpful video!
@phurien · 7 years ago
Hey Siraj, Love the videos. Question: I am taking the Udacity DL course, and am getting more and more into it and plan to continue on to make a career out of this. Would you recommend I switch over to Ubuntu as my primary OS or is it feasible to stay in Windows?
@shivajidutta8472 · 7 years ago
I think an alternative would be to deploy the models in your code directly rather than calling a REST API. I have a model running on my iPhone, and I don't see performance issues. The new chipsets are getting more and more powerful.
@jagdeepsihota6647 · 7 years ago
Can you please share either a blog or video on the steps you took to deploy to iPhone. Thank you
@SirajRaval · 7 years ago
share github!
@shreyanshvalentino · 7 years ago
share, please!
@xPROxSNIPExMW2xPOWER · 7 years ago
lol need this in about two weeks, thanks for a dank upload siraj!!!! really hope I don't run into that docker problem you had, I have over 20 docker images I think. lol 27:00 building custom linux kernels amirite lol
@SirajRaval · 7 years ago
dope u will do fine
@ShepardEffekt · 7 years ago
Was waiting for this
@SirajRaval · 7 years ago
dope
@machartpierre · 7 years ago
Hey Siraj! Thanks a lot for all this amazing content. I am working on generative models for symbolic (MIDI) music sequences. Your videos on the topic have been very useful. However, I'm intending on running the inference / generation part on mobile device (iOS). I am using TensorFlow and things seem to gradually improve (more functions, more support, more documentation) but I still find it very tricky to port the model on device (strip the unused / unsupported nodes, optimize, porting the generation scripts etc.). Even porting the fairly simple RBM model you used for one of your videos is challenging. Any suggestion on that? Given that running inference on mobile devices is becoming a trend, would you care to make a video about it?
@bibhu_107 · 6 years ago
To build the docker image use:
sudo docker build --pull -t $USER/tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .
To run:
sudo docker run --name=tensorflow_container -it $USER/tensorflow-serving-devel
@st0ox · 5 years ago
"we have to deal with C++" count me in :DD
@moelgendy_ · 7 years ago
Great video, Siraj! Could you add resources on how to deploy Keras models?
@tonydenion3557 · 7 years ago
Nice vid man! Do you like C? (didn't see any vids about it :D). I would like to know more about the TensorFlow C API. Thanks a lot for all the knowledge you share
@cameronfraser4136 · 7 years ago
My understanding is the TensorFlow C API wasn't designed to be used for production directly. If you want to deploy a model in C/C++, consider writing it from scratch; it's not as bad as it sounds (inference is much simpler than training). Deep networks are mostly just a series of matrix multiplies.
@SirajRaval · 7 years ago
more tf vids coming thx
@tonydenion3557 · 7 years ago
ty for answer, world gonna change thanks to guys like you ;)
@afshananwarali9462 · 6 years ago
Please share the link to part 2 of this tutorial for pushing this to the cloud.
@Superjeka1979 · 7 years ago
Hi Siraj, nice video! But I'm a bit confused about classification_signature and predict_signature in the MNIST example. Should I use both of them? Is there any difference between them? Why is the classification signature's input a string, etc.? Or is it just an example showing I can use a number of signatures to query a single model? Thank you.
@simpleman5098 · 7 years ago
Hey Siraj, what software do you use to make those images, like at 2:34 or 11:46 etc?
@themakeinfo · 7 years ago
Hi @siraj, could you please tell how to deploy a Keras model to production?
@adesojialu1051 · 3 years ago
I am working on image classification and my model is in tflite. How do I deploy? Do I need to change anything in your video tutorial?
@sig7813 · 4 years ago
If I use a saved scaler function from sklearn for the input data, can that be loaded to the server along with the model? Basically, before the model is called I have to apply that function to every input. I had to use a scaler since I have many inputs and they are very different: one can be in a range of 1-3, another 50000-1000000. For that I used StandardScaler from sklearn and it works great. To get the right prediction I have to apply it to the new incoming data.
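One common answer to the question above, sketched with scikit-learn: bundle the scaler and the model into a single Pipeline, so whatever you deploy applies the scaling automatically before every prediction (with TF Serving the analogue is baking the preprocessing into the exported graph). The data below just mirrors the ranges mentioned in the comment; all names are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Two features with wildly different ranges, as described above.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(1, 3, size=200),               # small-range input
    rng.uniform(50_000, 1_000_000, size=200),  # huge-range input
])
y = (X[:, 0] > 2).astype(int)

# The pipeline scales first, then classifies; pickling it ships both together.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

new_input = np.array([[2.5, 750_000.0]])
pred = model.predict(new_input)   # StandardScaler runs automatically first
```

Because the fitted scaler travels inside the pipeline object, there is no way to forget the scaling step at serving time.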
@drdeath2667 · 4 years ago
cardigan lol. inception network is savage
@abdelhaktali · 6 years ago
Hi Siraj, I trained a Keras model using ImageDataGenerator and flow_from_directory. When I deploy it in TensorFlow Serving I get the wrong class due to shuffle=True in flow_from_directory. How can I resolve this problem? Thanks
@jenlee6693 · 6 years ago
There is no /tensorflow folder to do 'configure' as Google has taken it out. It is no longer required to do the configure according to Google latest issue response. Just do 'bazel build -c opt tensorflow_serving/...' at tensorflow-serving directory. (of course without the ')
@deepanshuchoudhary4598 · 4 years ago
Come back buddy, we miss you!
@MrSanselvan · 6 years ago
@Siraj: Can we train models and deploy them incrementally? Does TF Serving support multiple smaller models? If yes, how can we do it? I cannot find any help on the internet.
@bhushanvernekar5121 · 7 years ago
I am not able to find a step-by-step procedure for how to work with TensorFlow in Android Studio
@pietart3596 · 6 years ago
Stupid question: are we using the MNIST model here? Or the ImageNet model?
@jenlee6693 · 6 years ago
after uncompress the inception model, do --> 'bazel-bin/tensorflow_serving/example/inception_saved_model --checkpoint_dir=inception-v3 --output_dir=inception-export' as the command on the tutorial is old and no longer works.
@sandhyakale9054 · 4 years ago
Why do we want to train the model? I want to deploy my chatbot on a website. Can you tell me?
@bhisal · 6 years ago
What's the advantage of serving a model using TF Serving compared to a plain REST API?
@LeksaJ4 · 7 years ago
Hi Siraj, thank you so much for the videos. bazel build failed on some error and I am gonna try it again tomorrow (it might be a problem with not enough memory for docker). However, I am kinda lost with docker and containers. Now when I shut it down, how do I get back to the step where I can run bazel build etc.? Thank you.
@fabregas1291 · 7 years ago
Hi, how could we use this approach of deploying a TensorFlow model to production for a re-trained Inception model using transfer learning?
@MrKemusa · 6 years ago
How would one go from building TensorFlow in Docker on a local CPU without CUDA support to deploying the container to a GPU instance in the cloud with CUDA support? Would I need to build TensorFlow again when I deploy the Docker container to the GPU and just enable CUDA support there? Or is there a way to have CUDA support on my CPU and maintain that when I deploy the container?
@souuu42 · 6 years ago
The process crashes when I try to create the docker image; it goes on for about 10 minutes and then everything freezes. Any idea why? I have an Intel i5 processor
@EpicMicky300 · 5 years ago
what's the difference between a docker image and a simple executable file?
@saitaro · 7 years ago
Siraj, if I wanna write an ML algorithm and make a web app based on it, would learning Django be useful for this task?
@yashsrivastava677 · 6 years ago
How can one do incremental training of models already deployed to serving?
@debu2in · 4 years ago
I think once you have accumulated the data, you can wrap the phases of the model training steps in functions, put those functions in a class, and trigger the class to train the model, persist the model on disk, and save the path in a db. At least this is how I do it :)
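A bare-bones sketch of the pattern that reply describes: training phases wrapped in methods, the model pickled to disk, and its path recorded. All names here are invented, and a plain dict stands in for the database table.

```python
import os
import pickle
import tempfile

from sklearn.linear_model import LogisticRegression

class TrainingJob:
    """Wraps the training phases and records where the model was saved."""

    def __init__(self, model_dir, registry):
        self.model_dir = model_dir
        self.registry = registry          # stand-in for a db table: name -> path

    def train(self, X, y):
        model = LogisticRegression()
        model.fit(X, y)
        return model

    def persist(self, model, name):
        path = os.path.join(self.model_dir, f"{name}.pkl")
        with open(path, "wb") as f:
            pickle.dump(model, f)
        self.registry[name] = path        # "save the path in db"
        return path

    def run(self, X, y, name):
        return self.persist(self.train(X, y), name)

registry = {}
job = TrainingJob(tempfile.mkdtemp(), registry)
path = job.run([[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1], "demo-model")
```

A serving process then only needs to look the path up in the registry and unpickle the file, so training and serving stay decoupled.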
@bilalchandio1 · 3 years ago
I am having an issue while deploying my deep learning model in h5 format on Flask. It works fine on my local machine; however, it has issues on my PythonAnywhere hosting server.
@bilalchandio1 · 3 years ago
It basically asks for a GPU.
@heathervica1108 · 7 years ago
Awesomeeeeee. Hello guys, do you know if it is possible to use Variational Autoencoders (VAEs) or Generative Adversarial Networks (GANs) for structured data? I have seen some examples, but just for unstructured data such as images, audio, etc. Maybe you have an example with structured data? Thanks a lot
@kariuki6644 · 7 years ago
Where would i be without you?
@SirajRaval · 7 years ago
love u
@matrixzoo8434 · 6 years ago
Does this mean that in order to make an ML web app I don't have to learn Django or any other python web framework, I could just use tensorflow?
@chicken6180 · 7 years ago
ok ive been convinced.... i will stop being a stubborn js scrub... *sigh* welp time to learn tf
@SirajRaval · 7 years ago
i made a js video called evolutionary tetris AI last week! check it out
@chicken6180 · 7 years ago
i know, i saw it. but as the majority of videos are in python it's working against me to be stubborn and not use that mainly
@600baller · 6 years ago
If I have an existing TF model and I trained with train_test_split, what do I do if I want to see my model's predictions on the entire dataset (including the original training and testing data)?
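train_test_split only controls which rows the model is fitted on; predict can then be called on any array of the right shape, including the full dataset. Sketched here with scikit-learn on made-up data (with a TF/Keras model it is the same idea: model.predict(X) on the full array):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Fit only on the training split...
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# ...but predict on every row, train and test alike.
all_preds = model.predict(X)
```

Just keep in mind that accuracy measured on `all_preds` mixes seen and unseen rows, so it is not a fair generalization estimate; use the held-out split for that.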
@sathyasarathi90 · 7 years ago
Siraj, I wonder if a similar strategy can be used to deploy a scikit-learn model?
When I run "bazel build -c ..", I get "no targets found beneath 'tensorflow_serving'".
@udaysah8038 · 5 years ago
I am currently facing a problem deploying my custom models where my image data is located on my local computer. Can you make a video on how to deploy custom models with image data on the local computer, save the models, and deploy them on Android devices?
@Vijaykumar-jx8jq · 5 years ago
Hey Siraj, actually I want to know: I have created an image classifier in Docker and now I want to integrate it into a system written in Python. How can I do that?
@alexp5693 · 7 years ago
Hello. I hope you will answer as it's really important for me. I'm currently working on a project and my task is to generate meaningful unique text from a set of keywords. It doesn't need to be large, at least a couple of sentences. I'm pretty sure I have to use LSTM but I can not find any good examples of generation of meaningful texts. I saw a few of randomly generated but that's all. I would be grateful for any advice. Thank you in advance.
@justinviola2479 · 5 years ago
How can we take that JSON output and have it display bounding boxes in the browser?
@theophilusananias1416 · 7 years ago
Siraj, Please, put together a video tutorial on how to generate an Image from Text with TensorFlow. (Text to Image)
@subhankarbhattacharya2940 · 4 years ago
The day he can show proficiency in linear algebra and differential equations etc., I would consider him a data scientist... otherwise it's all smartness practiced with code available in public
@hussain5755 · 7 years ago
Siraj, can you please recommend a book to get started on ML? Your videos are great but I am having a hard time grasping the concepts
@SirajRaval · 7 years ago
deep learning by bengio
@lotfiraghib7029 · 7 years ago
Hello Siraj, firstly thank you for this great video. I trained a model in Python, then saved it with tf.train.Saver to generate my checkpoint. I want to load this model in C++; is there a way to do that????
@AaronSarkissian · 7 years ago
I don't get this part: 32:08. How did that bazel command work outside of the docker container?
@prarthana1122 · 6 years ago
Same question... the bazel command didn't work in my docker either. How did he do it? Could you please tell us, Siraj
@akashtripathi5947 · 7 years ago
Can you please explain how I can build and serve a CNN model using Deeplearning4j in Java?
@rociogarcialuque6988 · 4 years ago
"If Google can use it, we can use it." is so 2017.
@bibhu_107 · 6 years ago
#update The tensorflow submodule has been removed. You should no longer have to run TensorFlow's configure script manually
@ChicagoBob123 · 7 years ago
Not really helping. Thought it would. I have seen SEVERAL videos online about training, how to train, etc. BUT if I have a mini PC onboard a device that I want to use the trained data on, HOW do I do that? What does training produce? There is NO CLARITY in any of the videos on the ACTUAL workflow of what is produced by training and how to re-purpose the results across platforms. Let's take cars for example. Small computers on board. HOW do I run a trained CNN for detection of road signs? I don't want to train it, I just want to RUN it, because it will go on thousands of cars. Do you have ANYTHING that helps to understand how to train, create something, and then LOAD and use the trained data on a machine? NOT a web-connected thing
@mockingbird3809 · 6 years ago
Hey Siraj, you didn't teach what I want. I want to know how to deploy a model to the web. If you have that video please share it with me, or please reply on how to do that
@wafaayad5899 · 7 years ago
failed: gcc failed: error executing command /usr/bin/gcc -U_FORTIFY_SOURCE -fstack-protector -Wall -B/usr/bin -B/usr/bin -Wunused-but-set-parameter -Wno-free-nonheap-object -fno-omit-frame-pointer -g0 -O2 '-D_FORTIFY_SOURCE=1' -DNDEBUG ... (remaining 106 argument(s) skipped): com.google.devtools.build.lib.shell.BadExitStatusException: Process exited with status 4. :( I can't get rid of this error. I'm new to ML/DL and that's what I get as my welcome message. Can anyone help, please?
@fabregas1291 · 7 years ago
You are likely running out of memory. Try reducing the number of parallel builds by passing '--local_resources 2048,.5,1.0', which instructs bazel to spawn no more than one compiler process at a time.
@zoranrazarac · 7 years ago
bazel-bin/tensorflow_serving/example/inception_export: No such file or directory Now what?
@johnnychan6755 · 7 years ago
Has anyone got an error like this, at the bazel build step? (run on Macbook Pro, OSX 10.11.6, via Docker method. With bazel 0.5.4 in Dockerfile) ERROR: /root/.cache/bazel/_bazel_root/f8d1071c69ea316497c31e40fe01608c/external/org_tensorflow/tensorflow/core/kernels/BUILD:2904:1: C++ compilation of rule '@org_tensorflow//tensorflow/core/kernels:conv_ops' failed (Exit 4). gcc: internal compiler error: Killed (program cc1plus)
@johnnychan6755 · 7 years ago
Solved! See this GitHub issue thread - scroll down. github.com/tensorflow/serving/issues/227
@charlesaydin2966 · 6 years ago
Thanks a lot!
@vibhanshusharma3150 · 7 years ago
Any video on image localisation
@charrystheodorakopoulos4843 · 4 years ago
hi @Siraj, great video. Please help if you can: I want to convert a TF model (only the .pb file is available) into a TFLite model.
While running this script:
import tensorflow as tf
converter = tf.lite.TFLiteConverter.from_saved_model("C:/tmp/")
tfl_model = converter.convert()
open("converted_model.lite", "wb").write(tfl_model)
I get this error:
Traceback (most recent call last):
File "C:/Users/user/PycharmProjects/converter/convert.py", line 4, in tfl_model = converter.convert()
File "C:\Users\user\PycharmProjects\converter\venv\lib\site-packages\tensorflow_core\lite\python\lite.py", line 400, in convert raise ValueError("This converter can only convert a single ")
ValueError: This converter can only convert a single ConcreteFunction. Converting multiple functions is under development.
While running the command line:
tflite_convert --saved_model_dir=C:/tmp --output_file=saved-model.tflite --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE --input_shape=1,416,416,3 --input_array=input --output_array=output --inference_type=FLOAT --input_data_type=FLOAT
I get this error:
RuntimeError: MetaGraphDef associated with tags serve could not be found in SavedModel. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`
@adesojialu1051 · 3 years ago
pls can i have a copy of your pipeline or pls how do i do mine?
@shreyanshvalentino · 7 years ago
the only useful video that you have uploaded till date!
@SirajRaval · 7 years ago
thx what else would be useful?
@shreyanshvalentino · 7 years ago
I was probably too excited when I typed that, hence the exaggeration! You probably don't want suggestions from a crappy coder like me. However, as much as I love your other tutorial videos, which are informative too, they are restricted to Jupyter notebooks. There is no way to send the information processed there anywhere a common person can use it. I started learning Django and RabbitMQ, thinking that only they could provide an interface to TensorFlow
@shreyanshvalentino · 7 years ago
Also, I am not sure if we have used the MNIST numerical-recognition classifier in your docker. Why did we not use that and instead use Inception? Edit - no need to answer, it got answered at 29:48
@MrKemusa · 6 years ago
Something else that could be useful: videos that showcase how to tailor out-of-the-box tutorials (e.g. the MNIST tutorial) to a completely different use case where the model is still useful (e.g. something with a dataset we've built from scratch). Sometimes there's friction going from these templates to your own use case. Eventually I figure it out, but it would be nice to have key things to consider when going from one use case to the next.
@Neonb88 · 5 years ago
If you want more detailed tutorials, look at Melvin L. He's really good with step-by-step solutions
@limyohwan · 4 years ago
On docker build --pull -t tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel I get "docker build requires exactly 1 argument"
@RowdyReview · 5 years ago
Hi Siraj, thanks for the great video. Please help me fix this issue. I have my own model, using the faster_rcnn_inception_v2_pets.config architecture, and I currently have trained checkpoints. But whenever I export the checkpoints using the command below:
bazel-bin/tensorflow_serving/example/inception_saved_model --checkpoint_dir=my-model6 --export_dir=inception-export
I get this error:
DataLossError (see above for traceback): Unable to open table file my-model6/model.ckpt-21292: Data loss: not an sstable (bad magic number): perhaps your file is in a different file format and you need to use a different restore operator? [[Node: save/RestoreV2_34 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2_34/tensor_names, save/RestoreV2_34/shape_and_slices)]]
Here I have TF=1.4 and Bazel=0.5.4. While training I got checkpoints like model.ckpt-21292.data-00000-of-00001, model.ckpt-21292.meta, model.ckpt-21292.index, which I renamed to model.ckpt-21292. I followed your video; you download a pre-trained model, but my question is: we both have the same type of checkpoints, so why am I getting the above error??
Thank you
@RowdyReview · 5 years ago
I found a solution! Just follow the video below and export your own model within 10 seconds: kzbin.info/www/bejne/rWGok6aYr5x7j6M