Jupyter Notebook kernel dies

After running the code for some time, the kernel dies and I have to run everything again from the start. Is there any solution to this problem? I am running it on Binder.

Binder is a free service.

Because of that, it limits the number of notebooks that can run at the same time.

If you don’t do anything for a while, it automatically disconnects you to free up resources.

Either avoid taking long breaks, or run Jupyter notebooks on your own computer.

Thank you for your suggestion. But even after running it locally, the issue persists.

Maybe your dataset is too large, or you are using model parameters that need a lot of compute power, and that is why it is crashing. I would suggest using Colab or Kaggle for memory-intensive operations.
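One quick way to check whether the dataset itself is the culprit is to look at how much memory the DataFrame actually occupies. A minimal sketch (the file name `data.csv` is just a placeholder for whatever you are loading):

```python
import pandas as pd

# Hypothetical file name; replace with whatever dataset you are loading
df = pd.read_csv("data.csv")

# Total memory the DataFrame occupies, in megabytes
mem_mb = df.memory_usage(deep=True).sum() / 1e6
print(f"DataFrame uses about {mem_mb:.1f} MB")

# Per-column breakdown; object/string columns often dominate
print(df.memory_usage(deep=True).sort_values(ascending=False).head())
```

If that number is close to the RAM you have available, the kernel dying is almost certainly an out-of-memory problem rather than anything Jupyter-specific.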

I had the same problem with a large dataset running Jupyter Lab on my computer.
The free version of Colab restricts the amount of RAM and disk space you can use, so I also had issues there.
Since I was only using the dataset for my own training, I reduced (halved) the size of the CSV file before creating a dataframe, and then both Colab and local Jupyter Lab worked fine.
Reducing the size of the dataset still allowed me to perform sensible machine learning.
Of course, if you’re using the large dataset for professional purposes, then you should buy a Colab licence.
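For what it’s worth, the halving can be done in pandas in a couple of lines. A minimal sketch (`large.csv` and `small.csv` are placeholder file names, and taking a random half of the rows is just one way to cut the file down):

```python
import pandas as pd

# Placeholder file names for the original and reduced datasets
df = pd.read_csv("large.csv")

# Keep a random half of the rows; random_state makes the sample reproducible
half = df.sample(frac=0.5, random_state=42)

# Write the smaller file once, then work from it in the notebook
half.to_csv("small.csv", index=False)
print(len(df), "rows reduced to", len(half))
```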

Which distribution of Python are you using?
I would suggest Anaconda. With Anaconda you run into fewer problems with the kernel dying.

I use the Anaconda Individual Edition, which includes Jupyter Lab, and I occasionally run into memory issues, but that is the fault of my computer.
We are not just using Python, but pandas, sklearn, matplotlib, etc. for machine learning, so we analyse and plot data during our training (see the sketch below).
That is why we use notebooks and not a plain Python IDE.
For Python and Django I use VSCode as my IDE.
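For context, a typical notebook cell in that workflow looks something like this. A minimal sketch with made-up file and column names, just to show the interactive analyse-plot-fit loop that makes notebooks worthwhile:

```python
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# Placeholder dataset and column names
df = pd.read_csv("data.csv")

# Quick look at the data before modelling
print(df.describe())
df.plot.scatter(x="feature", y="target")
plt.show()

# Fit a simple model and inspect it interactively
model = LinearRegression().fit(df[["feature"]], df["target"])
print("slope:", model.coef_[0], "intercept:", model.intercept_)
```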
