Well, I'm not sure how much you understand about monitors, so I'll try not to sound patronizing =)
The Hz is the monitor's refresh rate. Most run-of-the-mill monitors have a 60Hz refresh rate, which means the monitor will display up to 60 frames per second — the target most gamers aim for, since hitting it makes the game run silky smooth. A 144Hz monitor can display up to 144 frames per second, but only if your graphics card(s) can actually deliver that many.
So, it would give you the ability to "see" 144 frames per second, but only if your hardware actually runs the game at that amount. If your hardware can only manage 60 frames per second in a game, that's all you'll get. The Hz of the monitor is just a "cap" or limit. So the monitor won't increase your performance, it'll just raise the cap on the frames per second you can perceive.
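To put the "cap" idea another way, here's a toy sketch (the function name and numbers are just made up for illustration) of how the frames you actually see are limited by whichever is lower, your GPU's output or the monitor's refresh rate:

```python
# The monitor's Hz acts as a ceiling on the frames you can perceive:
# you see whichever is lower, the GPU's frame rate or the refresh rate.
def visible_fps(gpu_fps: int, monitor_hz: int) -> int:
    return min(gpu_fps, monitor_hz)

print(visible_fps(100, 60))   # 100 fps GPU on a 60Hz monitor -> you see 60
print(visible_fps(80, 144))   # 80 fps GPU on a 144Hz monitor -> you see 80
```

So the monitor never adds frames; it only stops being the bottleneck once its Hz exceeds what the card can push.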
I note that you're using a GTX 680, which is a high-end card. In a lot of games you would notice a difference over a 60Hz monitor, because your card will achieve well over 60 fps in many of them. On a 60Hz monitor, if your card is pushing 100 frames per second, you'll still only see 60, as the monitor can't refresh the screen any faster than that. There is a noticeable difference between a 60Hz monitor and a 100-120Hz monitor, provided your hardware can supply those framerates. Beyond roughly 100-120Hz, though, you wouldn't really notice or see the difference, even if your card achieved it.