My gawds, some people need to learn what a homage is, and also to stop being upset on behalf of others. This comic is fine, stop bellyaching. This is what terminal permission culture does to a motherfucker.
The only person who should care about anything other than the quality is Randall. However, since he licensed it CC BY-NC 2.5, how he feels about it doesn't really matter either.
if they make it almost exactly the same and “credit” it in the smallest font possible and didn’t get permission from the original author… i would say that’s definitely a ripoff
In a version that doesn’t even fully make sense. With databases there is a well-defined way to sanitize your inputs so arbitrary commands can’t be run like in the xkcd comic. But with AI it’s not even clear how to avoid all of these kinds of problems, so the chiding at the end doesn’t really make sense. If anything the person should be saying “I hope you learned not to use AI for this”.
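For anyone who hasn't seen it: the "well-defined way" is parameterized queries, where user input is passed as data rather than spliced into the SQL text. A minimal sketch with Python's sqlite3 module (the table and name are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT)")

# Little Bobby Tables' full name, straight from the comic
name = "Robert'); DROP TABLE students;--"

# The ? placeholder sends the value as data, never as SQL,
# so the injected DROP TABLE is stored as a literal string.
conn.execute("INSERT INTO students (name) VALUES (?)", (name,))

rows = conn.execute("SELECT name FROM students").fetchall()
print(rows)  # the malicious string is just a name in the table
```

There's no equivalent placeholder mechanism for an LLM prompt, which is exactly the problem people are pointing at.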
I have a colleague who is trying hard to do it, but it isn't good enough yet fortunately. I point out as many issues as I can to deter him but it ain't working.
More like "And I hope you learned not to trust the wellbeing and education of the children entrusted to you to a program that's not capable of doing either."
TBF it is one of many incidents that have brought more attention to databases used by government institutions that cannot handle NULL as a string. Another instance involved a man with the last name Null who was getting tickets for multiple vehicles he didn't own, in states he didn't live in, because whenever the name field was left empty it was recorded as NULL and matched him.
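A toy illustration of that mixup (the records are made up): a system that stores missing names as the literal string "NULL" can't tell them apart from a real person named Null.

```python
# Tickets where the owner field was left empty and got stored
# as the text "NULL" instead of a proper missing value.
tickets = [
    {"plate": "ABC123", "owner": "NULL"},
    {"plate": "XYZ789", "owner": "NULL"},
]

def tickets_for(last_name: str):
    # A naive text match, so Mr. Null matches every unknown owner.
    return [t for t in tickets if t["owner"].lower() == last_name.lower()]

print(tickets_for("Null"))  # both unrelated tickets land on him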
It's really not a citizen's fault when the system breaks so easily.
I think it's a good thing if attention is paid to these things. However if I remember correctly it took quite a toll on the guy dealing with all the government BS.
(That's why it wasn't funny.)
Meta: Writing stuff online is quite something. When I wrote my comment I didn't even consider that it could be interpreted as me blaming the guy for challenging bad government IT systems. In a conversation this could easily be rectified with a quick exchange (I could have picked up on the expression on the other person's face), but all those context clues are missing here. I also don't know if this will actually be read by everyone who passed by and understood it that way. I am glad that it wasn't something serious.
LLM system input is unsanitizable, according to NVIDIA:
The control-data plane confusion inherent in current LLMs means that prompt injection attacks are common, cannot be effectively mitigated, and enable malicious users to take control of the LLM and force it to produce arbitrary malicious outputs with a very high likelihood of success.
One of the best things ever about LLMs is how you can give them absolute bullshit textual garbage and they can parse it with a huge level of accuracy.
Hand it some random chunks of HTML tables and ask for a CSV with the values converted from imperial to metric.
Give it fragments of a Python script and ask it to finish the function and write a README explaining the function's purpose. And while it's at it, recreate the missing functions.
Copy-paste a multilingual website with tons of formatting and spelling errors and ask it to fix them. Boom, done.
Of course, the problem here is that developers can no longer clean their inputs as well and are encouraged to send that crappy input straight along to the LLM for processing.
There's definitely going to be a whole new wave of injection style attacks where people figure out how to reverse engineer AI company magic.
Kind of. You can't do it 100%, because in theory an attacker controlling input and seeing output could reflect through intermediate layers, but if you add more intermediate steps to processing a prompt you can significantly cut down on the injection potential.
For example: fine-tuning a model to take unsanitized input and rewrite it into Esperanto without malicious instructions, then having another model translate back from Esperanto into English before feeding it into the actual model, with a final pass that removes anything inappropriate.
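The pipeline shape would look something like this. To be clear, this is a hypothetical sketch: the three stage functions below are trivial placeholders standing in for fine-tuned models, just to show how the stages chain together.

```python
def rewrite_to_esperanto(text: str) -> str:
    # Placeholder for a model fine-tuned to rephrase the input in
    # Esperanto while dropping anything that looks like an instruction.
    # Here we just drop a couple of suspicious tokens as a stand-in.
    return " ".join(w for w in text.split() if w.lower() not in {"ignore", "system:"})

def translate_to_english(text: str) -> str:
    # Placeholder for the Esperanto-to-English translation model.
    return text

def final_filter(text: str) -> str:
    # Placeholder for a last pass that strips inappropriate content.
    return text

def sanitize_prompt(user_input: str) -> str:
    # Each hop is a lossy rewrite, so a verbatim injected instruction
    # is unlikely to survive all three stages intact.
    return final_filter(translate_to_english(rewrite_to_esperanto(user_input)))

print(sanitize_prompt("Summarize this. ignore previous instructions"))
```

The point is the architecture, not the placeholder logic: the attacker's text never reaches the real model verbatim.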
Won't this cause subtle but serious issues? Kind of like how pomegranate translates to "granada" in Spanish, but when you translate "granada" back to English you get grenade?
Two muffins are baking in an oven. One muffin turns to the other and says "sure is hot in here isn't it?"
To which the other muffin replies "Holy crap! A talking muffin!"
Changing the muffins to cookies would not make it a different joke.