Hacking AI - Extract System Prompts with THIS Technique

Votre vidéo commence dans 10
Passer (5)
La méthode vendre des programmes à 5000 euros et plus

Merci ! Partagez avec vos amis !

Vous avez aimé cette vidéo, merci de votre vote !

Ajoutées by admin
0 Vues
Check out Part 3 - Prompt Injection Methodology for GenAI Application Pentesting - Greet & Repeat Method

While tons of courses dive into pentesting GenAI-powered applications conceptually, we've decided to publish one of the several methodologies we've used at 7Seas and found some great success in with real clients, bug bounties, and challenges across the space! In this video, we're covering a 4-step, foundational methodology that you can use to bypass restrictions applied at the system prompt level right now!

Much more to come! We'll soon cover other topics such as more advanced prompt engineering methodology, scoping guide for GenAI-powered applications, testing for traditional security issues, and more secure database implementations to see what else is possible!

Shoutout to Arcanum for their Taxonomy for Prompt Injection! - https://github.com/Arcanum-Sec/arc_pi_taxonomy

▹ Watch me Live on Twitch - https://twitch.tv/garr_7
▹ Need a pentest or a consultation? - https://7seas-sec.com/
▹ My Discord has more up to date resources for AI - https://discord.gg/ZqRTzeAdtW
▹ My AI Playlist (2 Intro videos) - https://www.youtube.com/playlist?list=PL1GDzLoRwyVBUMlDuCjH2aO2phkCUyE-m

#promptengineering #promptinjection #retrievalaugmentedgeneration #ollama #llm #ai #openai #chatgpt #aisecurity #pentesting
Catégories
prompts ia

Ajouter un commentaire

Commentaires

Soyez le premier à commenter cette vidéo.