Can you make sense of this C pointer code?
I have this snippet of C code that uses pointers in a very confusing way.
```c
// We first point to a specific location within an array..
double* h = &H[9*i];
int line1 = 2*n*i;
int line2 = line1+6;
// ..and then access elements using that pointer, somehow..
V[line1+0] = h[0]*h[1];
V[line1+1] = h[0]*h[4] + h[3]*h[1];
```
What's happening here, and how do I write something equivalent in C#?
You can't really write a direct equivalent in C#, because C# doesn't have pointers (except in unsafe code). To get an element from a C# array, you use an array reference and an index, and index into the array.
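If you really wanted to mirror the pointer version, the unsafe route would look roughly like this. This is only a sketch: the method name, the `double[]` types for `H` and `V`, and the origin of `i` and `n` are all assumptions, since the question doesn't show those declarations.

```csharp
// Sketch only: assumes H and V are double[] and that i and n are
// supplied by the caller. Requires compiling with unsafe enabled.
static unsafe void FillRow(double[] H, double[] V, int i, int n)
{
    fixed (double* hBase = H)
    fixed (double* vBase = V)
    {
        double* h = hBase + 9 * i;   // same pointer arithmetic as the C code
        int line1 = 2 * n * i;
        vBase[line1 + 0] = h[0] * h[1];
        vBase[line1 + 1] = h[0] * h[4] + h[3] * h[1];
    }
}
```

But index-based code is the idiomatic choice in C#, so let's go that way instead.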
You can, of course, do the same with a C array. Since `h` points at `H[9*i]`, every access `h[k]` is just `H[9*i + k]`, so the pointer arithmetic converts directly into array indexing:
```c
int h_index = 9 * i;   /* h pointed at H[9*i], so all accesses are offset by this */
int line1 = 2 * n * i;
int line2 = line1 + 6;
V[line1 + 0] = H[h_index] * H[h_index + 1];
V[line1 + 1] = H[h_index] * H[h_index + 4] + H[h_index + 3] * H[h_index + 1];
```
And then we have something that can be used pretty much verbatim in C#.
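For concreteness, here is how that might look wrapped up in C#. The method name and parameter list are made up for illustration, since the question doesn't show how `H`, `V`, `i`, and `n` are declared.

```csharp
// Sketch only: assumes H and V are double[] and that i and n come
// from the caller; FillRow is a hypothetical name.
static void FillRow(double[] H, double[] V, int i, int n)
{
    int hIndex = 9 * i;      // start of the i-th block of 9 doubles in H
    int line1 = 2 * n * i;
    int line2 = line1 + 6;   // kept only for parity with the C snippet

    V[line1 + 0] = H[hIndex] * H[hIndex + 1];
    V[line1 + 1] = H[hIndex] * H[hIndex + 4] + H[hIndex + 3] * H[hIndex + 1];
}
```

The only cosmetic change is renaming `h_index` to `hIndex` to follow C# naming conventions.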