Could someone explain the difference between GPU memory and normal(?) memory? When we write a CUDA program, we seem to have to allocate memory a second time, separately.
@althalus What do you mean by "normal" memory? Do you mean CPU memory, i.e. main memory? GPU memory is just storage that lives on the GPU. We allocate it because our data has to reside on the device for kernels to use it, so we reserve space there through the CUDA API. Host code can't dereference that allocation directly, because the pointer refers to the GPU's address space rather than to main memory. So if there is one key difference, it's probably the separate address spaces.
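A minimal sketch of the separate-address-space point, assuming the standard CUDA runtime API (`cudaMalloc`/`cudaMemcpy`); it needs `nvcc` and a CUDA-capable GPU to run:

```cuda
#include <cuda_runtime.h>
#include <stdio.h>

int main(void) {
    const int n = 4;
    float host_data[n] = {1.f, 2.f, 3.f, 4.f};  // lives in main memory

    // Allocate space in the GPU's address space. dev_data holds a
    // device pointer: dereferencing it on the host is invalid.
    float *dev_data = NULL;
    cudaMalloc(&dev_data, n * sizeof(float));

    // Data moves between the two address spaces only via explicit copies.
    cudaMemcpy(dev_data, host_data, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(host_data, dev_data, n * sizeof(float), cudaMemcpyDeviceToHost);

    cudaFree(dev_data);
    printf("round-tripped: %f\n", host_data[0]);
    return 0;
}
```

The explicit `cudaMemcpy` calls are exactly the "allocate again" step from the question: the host array and the device array are distinct objects in distinct address spaces.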
@monkeyking thanks for the clarification. I did mean main memory, and the address-space discussion somewhere further down in the lecture cleared things up for me. But yes, you're right: the separate address spaces are the thing to keep in mind when we program in CUDA.