G-Sync is a proprietary Nvidia technology: you need a compatible card (anything from the last few generations should do; the 900 and 1000 series, as well as some 700 series cards, support it) and a matching monitor.
FreeSync / Adaptive Sync is part of the DisplayPort standard. AMD graphics cards going back quite a ways now support this feature.
What they both essentially accomplish is slaving the monitor to the GPU, so the GPU controls the monitor's refresh rate. Frames are drawn only when the card has one ready, and only at the beginning of a refresh cycle. The result is no screen tearing and (effectively) smooth gameplay even when your GPU can't produce enough frames.
Regardless of which technology you go with, the monitors themselves are still 144 Hz panels, so you can use V-Sync, which fixes the monitor's refresh rate and limits the GPU to swapping frames only at the beginning of a cycle. That prevents tearing, but if your GPU can't keep up, frames get held over for an extra cycle, which shows up as stutter and added input lag.
Or no sync at all. The GPU puts out as many frames as it can render, and the monitor grabs whatever is in the frame buffer at the beginning of each cycle. This always leads to tearing, but if the FPS is high enough it won't be terribly noticeable. The upside is that input lag is as low as possible, since nothing is waiting on anything else. You can also use backlight-strobing modes like ULMB and LightBoost to cut down on ghosting and motion blur (these modes don't run at the same time as G-Sync).
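The difference between the three modes comes down to timing. Here's a toy simulation of it (all numbers are hypothetical: a 144 Hz panel and a GPU that takes 10 ms per frame, i.e. ~100 FPS) showing why no-sync tears, why V-Sync drops effective FPS, and why adaptive sync does neither:

```python
import math

# Hypothetical numbers: 144 Hz panel, GPU renders one frame every 10 ms.
REFRESH_MS = 1000 / 144   # ~6.94 ms per fixed refresh cycle
RENDER_MS = 10.0          # GPU frame time (~100 FPS)

def no_sync(duration_ms=1000):
    """GPU swaps the buffer the instant a frame finishes; any swap that
    lands mid-scanout shows up as a tear line on that refresh."""
    swaps, tears, t = 0, 0, RENDER_MS
    while t < duration_ms:
        swaps += 1
        if t % REFRESH_MS > 1e-6:  # swap not aligned to a refresh boundary
            tears += 1
        t += RENDER_MS
    return swaps, tears

def v_sync(cycles=144):  # 144 cycles of a 144 Hz panel = one second
    """GPU waits for the next refresh boundary after each 10 ms frame, so
    every frame occupies two ~6.94 ms cycles: no tearing, but the
    effective frame rate falls below what the GPU could render."""
    displayed, cycle = 0, 0
    while cycle < cycles:
        finish_ms = cycle * REFRESH_MS + RENDER_MS
        cycle = math.ceil(finish_ms / REFRESH_MS)  # hold until next boundary
        displayed += 1
    return displayed

def adaptive_sync(duration_ms=1000):
    """Monitor refreshes whenever the GPU has a frame ready: every frame
    is shown the moment it finishes, with no tearing and no waiting."""
    return int(duration_ms // RENDER_MS)

if __name__ == "__main__":
    swaps, tears = no_sync()
    print(f"no sync:       {swaps} frames/s, {tears} with tearing")
    print(f"v-sync:        {v_sync()} frames/s, 0 with tearing")
    print(f"adaptive sync: {adaptive_sync()} frames/s, 0 with tearing")
```

Running it shows the trade-off in miniature: no-sync displays the most frames but nearly all of them tear, V-Sync tears never but shows noticeably fewer frames than the GPU rendered, and adaptive sync gets the full frame count with no tearing.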
G-Sync is more expensive because it requires a custom module that replaces the monitor's scaler. FreeSync still uses a traditional scaler, making it essentially a cheaper solution built into standard hardware.
End result: if you get a FreeSync monitor and have an Nvidia GPU, you can't use FreeSync or G-Sync. If you get a G-Sync monitor with an AMD GPU, you can't use G-Sync or FreeSync. But this isn't the end of the world; either configuration still works as a regular 144 Hz monitor, just without adaptive sync.