
Jupyter Lab Freezes The Computer When Out Of RAM - How To Prevent It?

I have recently started using Jupyter Lab, and my problem is that I work with quite large datasets (usually the dataset itself is approx. 1/4 of my computer's RAM). After a few transformations…

Solution 1:

By far the most robust solution to this problem is to use Docker containers. You can specify how much memory to allocate to Jupyter, and if the container runs out of memory it's simply not a big deal (just remember to save frequently, but that goes without saying).

This blog will get you most of the way there. There are also some decent instructions for setting up Jupyter Lab from one of the freely available, officially maintained Jupyter images here:

https://medium.com/fundbox-engineering/overview-d3759e83969c

and then you can modify the docker run command described in the tutorial as follows (e.g. for 3 GB):

docker run --memory 3g <other docker run args from tutorial here>

For syntax on the docker memory options, see this question:

What unit does the docker run "--memory" option expect?
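
Once the container is running, a quick sanity check (just a sketch; the exact cgroup file path depends on whether your host uses cgroup v1 or v2, so verify which one exists on your system) is to read the memory cap from inside a notebook cell:

from pathlib import Path

# Possible locations of the container's memory cap (cgroup v2 first, then v1).
candidates = [
    Path('/sys/fs/cgroup/memory.max'),
    Path('/sys/fs/cgroup/memory/memory.limit_in_bytes'),
]

for path in candidates:
    if path.exists():
        raw = path.read_text().strip()
        # cgroup v2 reports the literal string "max" when no limit is set.
        print('Memory limit:', 'unlimited' if raw == 'max' else f'{int(raw) / 1024**3:.1f} GiB')
        break
else:
    print('No cgroup memory file found - probably not running inside a container.')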

Solution 2:

If you are using a Linux-based OS, check out OOM killers; you can get more information from here. I don't know the details for Windows.

You can use earlyoom. It can be configured as you wish; e.g. earlyoom -s 90 -m 15 will start earlyoom, and when free swap falls below 90% and available memory falls below 15%, it will kill the process causing the OOM condition and prevent the whole system from freezing. You can also configure the priority of the processes.

Solution 3:

I also work with very large datasets (3 GB) on Jupyter Lab and have been experiencing the same issue. It's unclear whether you need to keep access to the pre-transformed data; if not, I've started using del on large dataframe variables I no longer need. del drops the reference so Python can free the memory. Edit: there are multiple possibilities for the issue I'm encountering. I run into it more often when using a remote Jupyter instance, and in Spyder as well when I'm performing large transformations.

e.g.

import pandas as pd

df = pd.read_csv('some_giant_dataframe')  # or whatever your import is
new_df = my_transform(df)
del df  # if the original is no longer needed
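
If memory still does not seem to come back after del, you can also ask the garbage collector to run explicitly (just a sketch; gc is in the standard library, and this is a nudge rather than a guarantee that the OS gets the memory back):

import gc

del df        # drop the last reference to the large dataframe
gc.collect()  # explicitly run the garbage collector to reclaim collectable objects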

Jakes, you may also find this thread on large data workflows helpful. I've been looking into Dask to help with handling data that doesn't fit comfortably in memory.
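
For reference, a minimal Dask sketch might look like this (assuming dask is installed and the data is split across CSV files; the file pattern and column name below are placeholders). Dask builds the computation lazily and processes the data in chunks instead of loading everything into RAM at once:

import dask.dataframe as dd

# Lazily define a dataframe over many CSV chunks without reading them into memory.
df = dd.read_csv('some_giant_dataset_*.csv')

# Nothing is loaded until .compute() is called, and only the (small) result is materialized.
result = df.groupby('some_column').mean().compute()
print(result)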

I've noticed in Spyder and Jupyter that the freeze-up usually happens when working in another console while a console with a large memory footprint is running. As for why it just freezes instead of crashing out, I think this has something to do with the kernel. There are a couple of memory issues open on the IPython GitHub - #10082 and #10117 seem most relevant. One user here suggests disabling tab completion in jedi or updating jedi.

In #10117 they propose checking the output of get_ipython().history_manager.db_log_output. I have the same issues and my setting is correct, but it's worth checking.
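
Checking it yourself is a one-liner in a notebook cell (get_ipython() is available in any IPython/Jupyter session):

# Should print False unless you have explicitly enabled logging of cell
# outputs to IPython's history database.
print(get_ipython().history_manager.db_log_output)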

Solution 4:

You can also use notebooks in the cloud, such as Google Colab. It provides a decent amount of RAM, and support for Jupyter notebooks comes by default.
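
As a rough sketch, you can check how much RAM the runtime actually gives you from a notebook cell (standard library only; assumes a Linux runtime, which is what Colab provides):

import os

# Total physical memory of the runtime, in GiB (Linux only).
total_gib = os.sysconf('SC_PAGE_SIZE') * os.sysconf('SC_PHYS_PAGES') / 1024 ** 3
print(f'Total RAM: {total_gib:.1f} GiB')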

Solution 5:

I am going to summarize the answers from the following question. You can limit the memory usage of your program. In the following, this is done for the function ram_intense_foo(). Before calling it, you need to call memory_limit(95), which caps the process at 95% of the currently available memory.

import resource
import sys
import numpy as np

def memory_limit(percent_of_free):
    """Cap this process's address space at a percentage of the currently available memory."""
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    resource.setrlimit(resource.RLIMIT_AS, (int(get_memory() * 1024 * percent_of_free / 100), hard))

def get_memory():
    """Return the available memory in kB, read from /proc/meminfo (Linux only)."""
    with open('/proc/meminfo', 'r') as mem:
        free_memory = 0
        for line in mem:
            sline = line.split()
            if str(sline[0]) == 'MemAvailable:':
                free_memory = int(sline[1])
                break
    return free_memory

def ram_intense_foo(a, b):
    """Allocate a large random matrix and return its Gram matrix."""
    A = np.random.rand(a, b)
    return A.T @ A

if __name__ == '__main__':
    memory_limit(95)
    try:
        temp = ram_intense_foo(4000, 10000)
        print(temp.shape)
    except MemoryError:
        sys.stderr.write('\n\nERROR: Memory Exception\n')
        sys.exit(1)
