Well, I've got a feeling I'm about to help you out quite a lot on this one, so I'll be as detailed as I can be.
I used to own the FX-8350 (paired with my current GTX 970 and an ASUS Crosshair V motherboard).
I decided to switch to my current processor (the exact same one you're asking about here).
My 5820K is now paired with an MSI X99 Gaming 9 ACK motherboard and overclocked to 4.5GHz.
I made the move out of pure curiosity, as I had never owned an Intel processor before the day I switched over.
Long story short below
If you decide to make the switch to the 5820K, expect around double the performance in nearly every single game you own.
Examples below:
Arma 2/3/DayZ Standalone
AMD framerate for me = 30-50fps
Intel = 75-100fps (big towns drop to 50 all the time because of Arma's poor optimization)
Battlefield 4
AMD = 60-90fps with a mix of ultra and low settings
Intel = 110-120fps all ultra with only AA turned off (locked at 120)
GTA V
AMD = 50-75fps with a mixture of high/low/ultra settings (grass quality turned down low)
Intel = 75-120fps with the same approach, all ultra with no AA (I use the SweetFX and ReShade frameworks in most games now)
Many other random games I play all the time show the same gap between both setups (yes, both builds had the exact same GPU at the exact same clock speeds, see signature).
E.g. Nosgoth/Warface/Dirty Bomb/Heroes & Generals/Killing Floor 2/Insurgency/etc. (the list goes on with what I have tested).
Most of these aren't too intensive, but most of them hover around 50-70fps on the AMD build, while on Intel they stay up at 120 with very rare drops to 100.
Now onto processor usage.
With AMD = usage is all over the place (80% to 100% depending on the game; it mostly hangs in at around 75% in Battlefield).
With Intel = this one is hilarious, but I'll explain. While gaming, I have yet to see this processor hit 50% usage under ANY circumstance in ANY game at all (while maintaining the performance explained above). While stress testing with AIDA64 at 1.3 volts at 100% load, I hover right at 70C; pretty sure I got semi-lucky with my chip.
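If you want to check your own CPU usage the same way before deciding, here's a rough Python sketch. It assumes the third-party `psutil` package (pip install psutil); the function names and the 50% figure are just my own setup, not anything official.

```python
# Rough sketch: sample overall CPU usage while a game runs, then report the
# peak and average so you can see whether the chip ever gets close to 100%.
# Assumes the third-party `psutil` package is installed.

def summarize(samples):
    """Given a list of CPU-usage percentages, return (peak, average)."""
    peak = max(samples)
    avg = sum(samples) / len(samples)
    return peak, avg

def watch_cpu(seconds=60, interval=1.0, sample=None):
    """Collect one overall-usage sample per interval and return the list.

    `sample` defaults to psutil.cpu_percent (which blocks for `interval`
    seconds per call); you can pass your own callable for testing.
    """
    if sample is None:
        import psutil  # third-party; my assumption, not from the original post
        sample = lambda: psutil.cpu_percent(interval=interval)
    samples = []
    for _ in range(int(seconds / interval)):
        samples.append(sample())
    return samples

if __name__ == "__main__":
    # Sample for 30 seconds while the game is running in the background.
    usage = watch_cpu(seconds=30)
    peak, avg = summarize(usage)
    print(f"peak {peak:.0f}%, average {avg:.0f}%")
```

Run it in the background while you game; on my 5820K the peak never crossed 50%, which is the point I'm making above.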
Moral of the story: if you're a streamer/gamer, it's a HUGE no-brainer to switch if you're already thinking about it.
P.S. I used XSplit when I was testing streaming on Intel, and my motherboard also has hardware encoding (the Streaming Engine feature), so I lost zero performance while streaming on this chip (if I leave the stream running while I game, I completely forget it's running).