Preparing a format for software evaluation
Preliminaries
Recently I was invited to beta test an application. Since I am not a professional in computer science or computing, I thought I could offer my point of view as an "end user"; after all, for application designers it is always useful "to design for the most awkward and clueless of possible users"... Ok, I do not rank quite that low on the scale, but I am not an expert in these things and, although I try to understand them, surely there are plenty of things that escape my understanding.
Well, back to the subject: I used the application and went through every option I could find, and I think I used it enough to consider myself an "average user". But then I needed to put the results of my experience with the software in writing. That may not be a problem for those who handle these tasks in software engineering or application design, but for a "digital immigrant" like me, who still remembers his early days with black-screen computers and green letters, it is a task I am not used to.
- Problem: I don't know how these reports are written.
- Solution: Look for help online and find guidance and, if possible, a format to serve as a guide.
Remembering a nickname they gave me
While looking for a guide to complete my task, I remembered an event from my past, when I had to act as an evaluator in a course taught by the university where I worked. In that course, the teachers work in groups of three to evaluate each participant.
In each group, one of the evaluators was a teacher who had taught the students in one of the subjects or sections; the other two had nothing to do with their academic training and could be considered impartial external observers.
Considering that evaluation work always runs the risk of being affected by the evaluator's subjective preferences, and finding that I had been given no list of criteria beyond the examination agenda (which meant my subjective judgment would end up driving the score), I quickly decided to draw up a minimum set of things that the presentation, the handling of the topic, and the techniques and resources used by the student should include. I organized them quickly and prepared a checklist to use when evaluating each student's presentation and answers.
This seemed the right thing to do, but I found that my attitude was considered in very bad taste by the students' teacher and also by the other colleague who was an evaluator in my group; their criteria were much more "artistically free", and they saw in my normative formalization a behavior that suffocated creativity and the "expressive richness of the presentations"... Ok, I hadn't expected that.
I earned a nickname that they thought would offend me a little, mixing my last name with a term they disliked: "Baremo-Brito".

Just in case it is not clear, here is a definition of baremo:
... a table of calculations that spares the general public, or a specific audience, the task of carrying out those calculations; a set of rules established by an institution to assess personal merits, used to rank candidates by merit, to justify a recognition or an achievement, to explain a failure, or to set admission criteria; a set of partial scores, analysis results, lists of index numbers, etc. (Original in Spanish, translation my own)
The matter did not bother me and I went ahead with the evaluations, with the advantage that I could justify every assigned score against previously established criteria, which reduced the effect of my subjective judgment. I was never selected as an evaluator again, but I learned something interesting from all of it.
It must be admitted that, because of my character, I may seem easy to deal with, but my "normative mania" makes me a "Bizarre Bug" in many ways: I am in favor of establishing formats, following action protocols, and evaluating with clear scales. So, faced with a task I have never done before, it seems natural to me to look for a guide and see how much of it I can use to complete my task.
A guide or model for writing a software evaluation report
On the Software Testing Help site I found a post that I think is what I was looking for: How To Write An Effective Test Summary Report [Sample Report Download]
In general, it gave me an idea of what I needed to do, but since this is not my area and what I need to report are my results as a beta user, I set out to prepare my own list of points to follow when writing my material, using the one provided by Software Testing Help as inspiration.
Now, what should I include in my report? According to what I reviewed, the sections of the report they present are:
- Purpose
- Application overview
- Scope of testing
- Metrics
- Types of tests performed
- Test environment and tools
- Lessons learned
- Recommendations
- Best practices
- Exit criteria
- Conclusion / Sign-off
- Definitions, acronyms and abbreviations.
Some of the sections are quite specialized and, reading the descriptions, I was not sure I had the technical knowledge to carry out such an exhaustive software evaluation, so I chose the points I could use to prepare my own beta test report from the point of view of an end user. They look like this (after the list I sketch how they might be laid out as an empty outline):
- Purpose: Brief description of the purpose of the report.
- General description of the application: Brief description of the tested application.
- Scope of the tests: Explains the functions and sections that underwent the tests. It also indicates what could not be tested because of user-level limitations or other restrictions.
- Types of tests performed: Describes the various types of tests performed. In my case they would have to cover: loading time, usability of the user interface, choice of color palettes, availability of communication channels, buying and selling functions within the application, and customization options for each user.
- Environment and devices for testing: Provides details about the test environment in which the evaluation is conducted, indicating its characteristics and the devices used.
- Recommendations: Any solution or suggestion to improve the application can be mentioned here.
- Best practices: Identifies situations, conditions and other elements that gave an advantage when doing the evaluation work; this information could be useful for future tests.
- Conclusion: This section gives the verdict: whether the evaluation results justify giving the application a "green light" (approval), or whether it falls short of what is expected and it must be said that "deploying the application in its current state is not recommended". As a beta tester, this is only a suggestion; the decision belongs to the people in charge at the software company.
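Since I like checklists and formats, here is a minimal sketch, written in Python, of how these points could be turned into an empty outline to fill in while testing. The section titles and the hint texts are just my own paraphrase of the list above, not an official template from Software Testing Help.

```python
# Minimal sketch of a beta-test report skeleton based on the sections above.
# The titles and hints are my own paraphrase, not an official template.

SECTIONS = [
    ("Purpose", "Why this report exists and who it is addressed to."),
    ("General description of the application", "Brief description of the tested application."),
    ("Scope of the tests", "What was tested and what could not be tested, and why."),
    ("Types of tests performed", "Loading time, UI usability, color palettes, "
                                 "communication channels, in-app buy/sell functions, "
                                 "per-user customization options."),
    ("Environment and devices for testing", "Test environment and device characteristics."),
    ("Recommendations", "Solutions or suggestions to improve the application."),
    ("Best practices", "Conditions that helped the evaluation; useful for future tests."),
    ("Conclusion", "Green light, or advise against releasing the application as it is."),
]


def report_skeleton() -> str:
    """Build an empty outline with one heading and a short hint per section."""
    lines = []
    for title, hint in SECTIONS:
        lines.append(f"== {title} ==")
        lines.append(f"({hint})")
        lines.append("")  # blank line left to fill in during the evaluation
    return "\n".join(lines)


if __name__ == "__main__":
    print(report_skeleton())
```

Running it just prints the headings with a hint under each one, so the output can be pasted into any text editor and filled in section by section.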
Concluding the topic
So far, all of this is just so I can write a software evaluation report as a beta user. Surely it can be improved in many ways, but I wanted to share it and leave open the possibility for other colleagues to point out weaknesses and ways to improve it for some future use.
Post published for the @project.hope community - https://beta.steemit.com/created/hive-175254
20% of this Post is intended to support @project.hope - Project #HOPE Community

Project #HOPE Website
Hi guys, @juanmolina, @jadams2k18, @lanzjoseg, @fucho80
This post is largely due to @coach.piotr's email, and I thought it could be useful, but since I'm a newbie on the subject, I wanted to ask whether you think I missed something important or have sections that are unnecessary.
I have a Spanish version of this post: Estudiando un formato para evaluación de software
I'll take a look at this soon :)
OK
Wow! This is a serious guide for performing a software evaluation. I compliment you on that. I didn't make mine so formal. But it's good to know that there are formal guidelines for this kind of activity.
Baremo-Brito, I see it more as a compliment than a joke ;)
By the way, I only have one sheet of written material about the game.
Thank God you wrote it in Spanish too... hehehe
There is something I heard a long time ago, and so far it has held true in every case I have seen or experienced: "There are manuals or instructions for everything you can imagine."
I took the nickname kindly; only later did I understand that that was not the intention. In that sort of thing I'm a little slow.
The reports are usually short; they are not masterly treatises on the software :) They are technical descriptions that communicate, in an organized, exhaustive and brief way, what the beta testers have found.
The Spanish version has slight adjustments, almost unnoticeable. It's a translation thing, you know; it is never literal, there are always parts that change a bit to preserve the sense of what is said.
If it looks good to the group, then it can be used as a guide, and afterwards it will be time to modify and adjust it according to the corrections that are made.
That's good to know...
It is a basis for knowing how to structure a solution. It helps a lot.
Thanks