How to Use and Interpret AI Content Detectors


Greetings, fellow Steemians.

On this occasion I would like to share with all of you some considerations, from my point of view, on how content that is likely to have been created with artificial intelligence applications should be evaluated.

[Image: Pixabay]

Since the Steemit team gave the instruction to flag and report content possibly created by artificial intelligence, treating it as plagiarism, many cases have come up regarding review and evaluation, both with the tools that are used and with the analysis of the results obtained.


The first thing we must be clear about on this issue is that AI content detector tools only provide a probability, or rather an estimate, of the existence of paraphrasing in a piece of content relative to a large amount of information taken from the internet. This review task therefore deserves some rigor. For this reason, and according to my criteria and experience, it is necessary to review the content using several tools in order to obtain greater support for the results.

On the other hand, it is important to keep in mind the large amount of information shared on the internet. This means there are ever more ways to explain and develop any topic, so the probability of repeating phrases will tend to increase over time. In other words, we must require a high range (percentage) of similarity or paraphrasing when analyzing the results before passing judgment on the content under review.

Summarizing these two points: it is necessary to review a piece of content with at least two AI content detector tools, and the AI probability reported by the tools used must exceed 70% before we act on it.
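To make that rule concrete, here is a minimal sketch in Python, assuming the AI-probability percentages have already been read off each detector by hand. The function name, the score dictionary, and its keys are illustrative, not part of any detector's API; only the 70% threshold and the two-tool minimum come from the rule above.

```python
# Minimal sketch of the decision rule described above.
# Scores are hypothetical percentages copied by hand from each detector.

AI_THRESHOLD = 70.0   # percent; minimum probability to count a detector as "AI"
MIN_TOOLS = 2         # minimum number of detectors that must agree

def classify(scores: dict[str, float]) -> str:
    """Return 'AI content' only when at least MIN_TOOLS detectors
    report an AI probability above AI_THRESHOLD."""
    if len(scores) < MIN_TOOLS:
        return "inconclusive: run at least two detectors"
    flagged = [name for name, pct in scores.items() if pct > AI_THRESHOLD]
    if len(flagged) >= MIN_TOOLS:
        return "AI content (flagged by: " + ", ".join(flagged) + ")"
    return "treat as human-written, but stay alert"

# Example: hypothetical readings for two different posts
print(classify({"openai-detector": 99.9, "zerogpt": 12.0, "paraphrasingtool": 8.0}))
# -> treat as human-written, but stay alert
print(classify({"openai-detector": 99.9, "zerogpt": 86.0}))
# -> AI content (flagged by: openai-detector, zerogpt)
```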

Below I share some tools to detect AI content, in order of preference:

a. https://openai-openai-detector.hf.space/
b. https://www.zerogpt.com/
c. https://paraphrasingtool.ai/ai-content-detector/
d. https://contentatscale.ai/ai-content-detector/
e. https://crossplag.com/ai-content-detector/

It is important that any content showing a high percentage of AI generation is also run through plagiarism detectors, as it may be copying content directly from a particular source.

Extra advice when reviewing content:

Do not include presentation paragraphs or links generated by images; by this I mean that we should only apply the review to the main body of the publication.
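As a rough illustration of that preparation step, here is a minimal sketch, assuming the post is available as plain markdown text. The function name and the "drop the first paragraph" heuristic are my own assumptions; the detectors themselves simply receive whatever text is pasted into them.

```python
# Minimal sketch: strip links and image markup, and skip the greeting
# paragraph, before pasting the remaining text into a detector.
import re

def prepare_for_detector(post_text: str, drop_first_paragraph: bool = True) -> str:
    """Return only the main body of a post, without links or image markup."""
    text = re.sub(r"!\[[^\]]*\]\([^)]*\)", "", post_text)    # markdown images
    text = re.sub(r"\[([^\]]*)\]\([^)]*\)", r"\1", text)      # markdown links -> keep label
    text = re.sub(r"https?://\S+", "", text)                  # bare URLs
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    if drop_first_paragraph and len(paragraphs) > 1:
        paragraphs = paragraphs[1:]                            # skip the presentation paragraph
    return "\n\n".join(paragraphs)
```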

Let's look at an example:

We recently reviewed the content of a publication that had been diagnosed as "AI Content".

[Screenshots of the results from the OpenAI detector, ZeroGPT, and Paraphrasingtool]

Note the difference in the results: the OpenAI tool registers the highest probability that the content was generated by an artificial intelligence application, while the other detectors show a high probability that the content was written by a human. You can also see that I did not take into account the presentation paragraph or the links within the content, as these influence the result.

This does not mean that the tools are unreliable, but rather that they report probabilities, and therefore we should be as cautious as possible when making a "human" diagnosis or judgment of the content reviewed.

An example of analysis of the results:

From my point of view, the content presents a low rate of paraphrasing considering the results of the three detectors used; therefore, it will be treated as original content. However, the moderate percentage returned by one of the tools tells me that I should stay alert to the content shared by this author.


In the following example, the results show a clear indication that the content was created with artificial intelligence: both detectors found more than 80% paraphrasing in relation to other sources.

[Screenshots of the results from the OpenAI detector and ZeroGPT]

An example of analysis of the results:

The content of this publication should be treated as "AI content"; therefore, the author should be called upon to stop using this method of content creation.


Cc:
@steemcurator01
@abuse-watcher
@remlaps
@ashkhan

I thank you for the support and attention you have given me.

I hope it will be helpful and useful.

Give your support to Steem and SBD through Coinmarket, by clicking on "Good".


I cannot say goodbye without first inviting you to join the #Club5050, where the goal is to accelerate the growth of our accounts to provide greater support in our communities.

To be part of the #club5050, you just have to make sure that during the last month you have powered up more STEEM than you have withdrawn.

Let's go for more

World Of Xpilar




Thank you, friend!
I'm @steem.history, a Steem witness.
Thank you for witness voting for me.
(Go to https://steemit.com/~witnesses and type fbslo at the bottom of the page)

The weight is reduced because of the lack of Voting Power. If you vote for me as a witness, you can get my small vote.

Thank you so much for sharing this post. It's very important, as we were all confused about these apps. You have done great work. I will try to share this with as many moderators and users as possible so that it makes their work easier.

It is important to maintain balanced support as better and more effective tools become available; it is our duty to watch over the well-being of the platform.

Thank you so much for the guidance; it's needed, since some apps show AI content and others do not. Hopefully, with coordination, we will manage it.

Yes, as we move forward we will find better ways to correctly evaluate these cases.

Thank you so much for the information and guidelines. Yes, you are right that we must verify the results of one detector against another and then finalize our decision.

I just have a question: there are many users on the platform who use Google Translate or another translator to publish their posts in English because they don't know English well. But when we check that translated content, it shows higher percentages (sometimes 99% AI-generated) in different tools.

So, what should we do in those cases?

Greetings, friend @steemdoctor1

Those cases are interesting. Each one will have to be evaluated at publication, since not everyone writes well; that is, some tend to repeat many words or phrases, and others tend to rely on internet sources without producing a good analysis of their own. These factors can be responsible for the high percentages that the detectors return.

Here's a free vote on behalf of @se-witness.

The AI related tools offer exciting possibilities for creating educational content as well. For those interested in utilizing AI for course creation, https://ai-depot.net/tools/leiapix/ provides valuable resources. They help explore how AI can be used to generate educational content efficiently. Embracing AI in education can streamline content creation and enhance learning experiences.
