Shocking New Joe Rogan Clip is Spreading on TikTok…




Clip from Lew Later (The TRUE Cost of the iPhone 14 Pro Max…) – https://youtube.com/live/gAWp8g6kKsE.



About the Author: LaterClips

37 Comments

  1. They will just create a verification system for this too. Once people realise deepfakes are a thing, they'll think twice before buying stuff.

  2. I would say people who routinely watch JRE would know something was up, because I have never seen graphics used like that on the show. People who don't watch could easily be fooled. Humans will use any means necessary to acquire money or just be a douche.

  3. Podcasters/vloggers are gonna have to trademark/copyright their voices now because of these deepfakes.
    In fact, every person on the planet may have to do that.
    There are so many ways a deepfaked AI voice can implicate you in something, or make anyone "believe" you are alive when you may not be. Not kidding!!

  4. My opinion is that the solution will have to be some form of embedding from the original, verified channel. That way people know the content is the same as what the original channel produced.
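The "embedding from the verified channel" idea above is essentially content signing. A minimal sketch of the concept, using Python's standard-library HMAC as a stand-in (a real scheme would use public-key signatures such as Ed25519 so viewers never need the channel's secret; the key and function names here are hypothetical):

```python
import hashlib
import hmac

# Hypothetical secret held by the verified channel. With public-key
# signatures, only the channel would hold the signing key and anyone
# could verify with the public key.
CHANNEL_KEY = b"channel-secret-key"

def embed_signature(video: bytes) -> bytes:
    """Produce a tag the channel embeds alongside the upload."""
    return hmac.new(CHANNEL_KEY, video, hashlib.sha256).digest()

def is_authentic(video: bytes, tag: bytes) -> bool:
    """Check that the content matches what the channel signed."""
    return hmac.compare_digest(tag, embed_signature(video))

clip = b"original frame data"
tag = embed_signature(clip)
print(is_authentic(clip, tag))         # original content verifies
print(is_authentic(b"deepfake", tag))  # altered content does not
```

Any re-encoded or tampered copy fails verification, which is the property the commenter is after.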

  5. Man, that doesn't sound real at all.
    If you know anything about Joe Rogan and internet advertising, you know first off this doesn't sound remotely like him. He almost never pulls up image content so quickly without asking Jamie first to put it on screen. That's just not his or Jamie's flow…
    Also, he doesn't talk like that. This just isn't his cadence, nor is it how he speaks.

  6. The only people who believe this are people who don't watch his podcast (JRE), but if they don't watch his podcast, why would they actually listen to what he's telling them to buy? The only people hurt by this are morons.

  7. So I'm thinking to myself, the best way to combat this would be for each content creator to run their own site, or for content to be accessible only to subscribers of specific channels. This is where copyright needs to step in for the better: remove any video uploaded by someone who doesn't own it. I'm playing devil's advocate here, but it would be the best way to fight it. Users need to start paying attention to who is uploading the video.

  8. Those of us who were paying attention when deepfakes first appeared knew this was coming the moment we saw the first one. However, most of society is definitely not ready to discern the difference. This is the new "let us wire you money, we just need your bank account number."

  9. One potential way this can be fixed or avoided is by using blockchain. Digital fingerprinting or watermarking technology could create a unique identifier for each video, with an on-screen QR code to scan. That identifier could then be stored on a blockchain, allowing for quick and easy verification of the video's authenticity.

  10. No wayyyy… no sane person will fall for this crap… having said that… this is the sort of misinformation that social media needs to focus on flagging

  11. I'm glad Lew pointed out that it's not ALL misinformation. Labeling it all misinformation actually dismisses how complex this problem really is. Even in a controlled environment, if all of the information were objectively "good" information (for lack of a better term) and unimpeded by the powers that be, people have different consumption behaviors, interests, and locations, which results in consuming DIFFERENT information. This alone is a serious challenge; throwing governments and agendas into the mix only exacerbates the problem.

  12. 4:15 ETERNITY…. VOID…. NEXT… nweh, Whew, whheew, whoowh, hehwwhh, WHWHAHAEEHH 🤣🤣🤣🤣😂😂😂😂😂😂😂🤣🤣🤣🤣🤣🤣🤣🤣😂😂😂😂😂😂😂😂😂 It totally could be like that, in this world at this time, ABSOLUTELY 🤣🤣🤣🤣🤣🤣🤣

  13. the way to beat the bots is to be extremely unPC… tread where they're programmed not to go

    … use constant racial slurs, gender put downs, etc… every other word, to verify authenticity, of course

    (ps. this will be the plot for Terminator Infinity ♾️… and I was surprised how many there are, actually)
