Hi Support Team,
I noticed that after loading a CSV file into a DataFrame using pd.read_csv(…), the content displays successfully the first time I execute a cell containing just the name of the DataFrame.
However, the second time I try to view the contents of the same DataFrame (say, after applying dropna(…) or some other data-cleaning step), again by just typing its name in a cell, the cell runs indefinitely without displaying the content that appeared the first time (see the following screenshot).
Finally, after running for a long time, it shows this message (screenshot attached):
“Your session crashed after using all available RAM. If you are interested in access to high-RAM runtimes, you may want to check out [Colab Pro].”
This is the size of my Training Set:
Training data size: (307511, 122)
I wonder if there is any workaround for this that does not require truncating my data.
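One workaround I was considering is downcasting the numeric columns right after loading, so the DataFrame occupies less RAM before any further display or cleaning. Below is a rough sketch of what I mean (the column names are made up for illustration; my real data has 122 columns). Would something along these lines be a reasonable approach?

```python
import numpy as np
import pandas as pd

def downcast_numeric(df: pd.DataFrame) -> pd.DataFrame:
    """Shrink int64/float64 columns to the smallest dtype that fits their values."""
    out = df.copy()
    for col in out.select_dtypes(include="integer").columns:
        out[col] = pd.to_numeric(out[col], downcast="integer")
    for col in out.select_dtypes(include="float").columns:
        out[col] = pd.to_numeric(out[col], downcast="float")
    return out

# Small synthetic frame standing in for my real CSV (hypothetical columns).
# In practice I would do: df = pd.read_csv(...) and then downcast_numeric(df).
df = pd.DataFrame({
    "SK_ID": np.arange(1000, dtype="int64"),          # fits easily in int16
    "AMT": np.random.rand(1000).astype("float64"),    # float32 is enough here
})
small = downcast_numeric(df)

before = df.memory_usage(deep=True).sum()
after = small.memory_usage(deep=True).sum()
print(f"memory: {before} -> {after} bytes")
```

On my understanding, this keeps every row and column intact and only changes the storage dtypes, trading some numeric precision for memory.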
Could you please let me know how to resolve this issue?