I'm trying to run a program (the GCC compiler) that uses a lot of RAM while it works. Unfortunately, it crashes due to a lack of memory when its process has occupied only about 2.2 GiB of virtual memory (according to the GNOME System Monitor). I have heard that on a 32-bit OS, up to 4 GiB can be available to a single process. What is the reason for this discrepancy, and is there any way around the limitation? Why can't the process use memory all the way up to that maximum?
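To see where the limit actually lies, I put together a small probe (my own sketch, nothing to do with GCC itself) that keeps calling malloc() until it fails and then reports the total it managed to obtain:

```c
#include <stdio.h>
#include <stdlib.h>

/* Allocate memory in 64 MiB chunks until malloc() fails, then report
 * the total.  This measures how much virtual address space a single
 * process can actually obtain; on 32-bit Linux this usually stops
 * well short of 4 GiB, because the kernel reserves part of the
 * address space for itself. */
int main(void)
{
    const size_t chunk = 64 * 1024 * 1024;  /* 64 MiB per allocation */
    size_t total = 0;

    for (;;) {
        char *p = malloc(chunk);
        if (p == NULL)
            break;
        p[0] = 1;      /* touch the first page so the mapping is real */
        total += chunk;
    }
    printf("allocated %zu MiB before malloc() failed\n",
           total / (1024 * 1024));
    return 0;          /* leaked memory is reclaimed on exit */
}
```

Built with `gcc -m32 probe.c -o probe` and run inside the 32-bit environment, it should show how much of the theoretical 4 GiB is really reachable.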
This is what GCC itself prints right before it dies:
```
virtual memory exhausted: Cannot allocate memory
```

But here is the output of the command `ulimit -a`:
```
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 16382
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) unlimited
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
```

I run the program in a chroot jail with a 32-bit environment, on a machine with 6 GB of RAM. I need this to test the build for the i386 architecture.
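For completeness, this is roughly how I enter the 32-bit environment (the chroot path here is just an example, not my real one):

```sh
# linux32 (an alias for setarch) sets the i386 personality, so uname -m
# reports a 32-bit architecture inside the jail.
sudo linux32 chroot /srv/chroot/i386 /bin/bash
```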