Software Developer with solid experience in Robotic Process Automation (RPA), backend development, and systems integration. I have worked both as an in-house developer and as a freelance RPA consultant for companies in Brazil and abroad.
My background includes designing automation for large healthcare institutions and developing machine-level algorithms in aerospace manufacturing. While my experience spans multiple areas of software engineering, my current professional focus is on RPA and intelligent process automation.
I’m passionate about optimizing workflows, eliminating manual tasks, and delivering scalable, high-impact automation solutions. I am actively seeking international opportunities where I can contribute as an RPA specialist.
Outside work, I enjoy playing piano, studying languages, board games, cycling, and volunteering as a catechist.
My Projects
Detailed projects with videos and explanations.
BlinkAccess – Eye Blink Control for People with Motor Paralysis
Imagine being able to control a computer using only your eyes — even when the rest of your body cannot move.
BlinkAccess is a system that combines my passion for technology with my desire to help others and make life easier.
It allows individuals with total motor paralysis to interact with a computer and call for help using simple eye blinks.
New tasks can be added easily, and because it’s open source, anyone can access and contribute to it.
While many people see RPA as something that “takes jobs away,” I see it as a powerful tool to give people dignity, autonomy, and a better life — letting computers handle the hard work.
Using a regular webcam and intelligent algorithms, BlinkAccess detects different blink patterns to execute essential commands such as:
Calling for medical assistance with specific blink sequences
Playing and pausing music on YouTube
Navigating commands easily and intuitively
The system combines advanced computer vision techniques with operating system automation and voice feedback to provide an accessible, lightweight, and easy-to-use experience — all running in real-time on a standard laptop.
How It Works
BlinkAccess uses the webcam to track the user’s face with MediaPipe technology, identifying key eye landmarks. Based on the Eye Aspect Ratio (EAR) metric, it recognizes short and long blinks, interpreting them as commands to control computer actions.
To ensure safety and reduce errors, the system includes confirmation modes activated by long blinks, automatic rest periods, and audio feedback that guides the user through every step.
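To make the blink-detection idea concrete, here is a minimal sketch of the EAR calculation, a ratio of vertical to horizontal eye-landmark distances. The landmark ordering, example threshold, and class names below are illustrative assumptions, not code lifted from the BlinkAccess repository.

```java
// Illustrative sketch of the Eye Aspect Ratio (EAR) calculation.
// Landmark order follows the common EAR formulation (p1..p6 around one eye);
// names and the 0.2 threshold are assumptions, not the BlinkAccess code.
public class EarSketch {

    static double distance(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1];
        return Math.sqrt(dx * dx + dy * dy);
    }

    // EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|)
    static double eyeAspectRatio(double[][] p) {
        double vertical = distance(p[1], p[5]) + distance(p[2], p[4]);
        double horizontal = distance(p[0], p[3]);
        return vertical / (2.0 * horizontal);
    }

    public static void main(String[] args) {
        // Dummy landmarks for an open eye, as (x, y) pixel coordinates.
        double[][] openEye = {
            {100, 50}, {110, 40}, {125, 40}, {140, 50}, {125, 60}, {110, 60}
        };
        double ear = eyeAspectRatio(openEye);
        // A blink is typically flagged when EAR stays below a threshold
        // (around 0.2) for a few consecutive frames.
        System.out.printf("EAR = %.3f, blink = %b%n", ear, ear < 0.2);
    }
}
```

Long blinks are then distinguished from short ones simply by counting how many consecutive frames stay below the threshold.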
Why BlinkAccess?
This project is a real social solution with the potential to improve the quality of life for people with limited mobility by providing autonomy and safety through technology.
Its clean and modular codebase allows for future enhancements and adaptation to diverse assistive needs.
You can review the code on my GitHub repository, or see BlinkAccess in action:
ASL Sign Recognition – Translating Hand Signs into Speech
Imagine being able to communicate through hand signs and having a computer instantly recognize and translate them for you.
ASL Sign Recognition is a project that combines my love for technology with my desire to break communication barriers and help others.
Originally, I developed a basic hand detection system using MediaPipe about four years ago. Recently, I revisited this old project and upgraded it to recognize ASL letters.
For now, it recognizes the letters A, B, and C, but I plan to implement the entire alphabet, recognize words, and eventually integrate a voice system to speak the translated letters or words out loud.
While similar systems already exist, developing my own from scratch is a way to test and deepen my skills — and to create assistive solutions that are fully customizable and open source.
Using a regular webcam and intelligent computer vision algorithms, the system can:
Detect ASL letters (currently A, B, and C)
Recognize hand sign patterns in real time
Enable future translation of words and sentences with speech output
How It Works
The system uses the webcam to capture the user’s hand and MediaPipe technology to identify its landmarks.
By analyzing the relative positions of fingers and hand orientation, it classifies each sign into the corresponding ASL letter.
Future versions will include complete alphabet recognition and Natural Language Processing to interpret full words and phrases.
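To illustrate the classification step, the sketch below shows the kind of geometric rule involved: it separates a closed fist ("A") from an open hand ("B") by comparing fingertip and knuckle heights. The landmark indices follow MediaPipe's hand model, but the decision rules are a simplified assumption, not the project's exact logic.

```java
// Illustrative landmark-based classifier for two ASL letters.
// Landmark indices follow MediaPipe's hand model (tips 8/12/16/20, PIP joints 6/10/14/18);
// the decision rules are simplified assumptions, not the project's exact logic.
public class AslSketch {

    // landmarks[i] = {x, y} in normalized image coordinates (y grows downward).
    static String classify(double[][] lm) {
        int[] tips = {8, 12, 16, 20};   // index, middle, ring, pinky fingertips
        int[] pips = {6, 10, 14, 18};   // corresponding PIP joints

        int extended = 0;
        for (int i = 0; i < tips.length; i++) {
            // A finger counts as "extended" when its tip is above (smaller y than) its PIP joint.
            if (lm[tips[i]][1] < lm[pips[i]][1]) extended++;
        }

        if (extended == 0) return "A";   // fist: all four fingers folded
        if (extended == 4) return "B";   // flat hand: all four fingers extended
        return "unknown";
    }

    public static void main(String[] args) {
        double[][] fist = new double[21][2];
        for (double[] p : fist) { p[0] = 0.5; p[1] = 0.5; }
        // Fold all fingers: tips lower (larger y) than their PIP joints.
        for (int tip : new int[]{8, 12, 16, 20}) fist[tip][1] = 0.7;
        System.out.println(classify(fist));  // prints "A"
    }
}
```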
Why ASL Sign Recognition?
This project is a small step towards bridging the communication gap for the deaf and hard-of-hearing community, enabling smoother interactions and supporting inclusive technologies.
My friend’s parents are deaf, and they offered to test it as it evolves, motivating me even more to build something meaningful.
You can review the code on my GitHub repository, or see ASL Sign Recognition in action:
Fuzzy Shell for Decision Support Systems
Final graduation project developed with Paulo R. E. de Oliveira Jr. for our B.Sc. in Computer Science.
The goal was to build a reusable Shell capable of generating fuzzy logic systems with a graphical interface, enabling experts from any domain to model decision-making processes involving uncertainty and imprecise data.
We implemented a fuzzy inference engine supporting trapezoidal membership functions and rule evaluation using Mamdani-style logic (Min/Max operators). The system includes:
Definition of linguistic variables and fuzzy sets
Creation of fuzzy rules using AND/OR logic
Inference calculation and defuzzification via the Centroid method
Web interface built with Django for model management and simulation
Users can define input/output variables, create inference rules, test different scenarios, and view results graphically.
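As a rough illustration of two of these building blocks, the sketch below implements a trapezoidal membership function and centroid defuzzification over a discretized domain. The variable names, domains, and the example rule are illustrative, not taken from the Shell's code.

```java
// Sketch of two pieces of a Mamdani-style fuzzy system: a trapezoidal
// membership function and centroid defuzzification over a sampled domain.
// The "risk" variable and its domain are illustrative assumptions.
public class FuzzySketch {

    // Trapezoid defined by points a <= b <= c <= d: 0 outside [a, d], 1 on [b, c].
    static double trapezoid(double x, double a, double b, double c, double d) {
        if (x <= a || x >= d) return 0.0;
        if (x >= b && x <= c) return 1.0;
        if (x < b) return (x - a) / (b - a);
        return (d - x) / (d - c);
    }

    // Centroid of an aggregated output membership function sampled over [min, max].
    static double centroid(java.util.function.DoubleUnaryOperator mu,
                           double min, double max, int steps) {
        double num = 0.0, den = 0.0;
        for (int i = 0; i <= steps; i++) {
            double x = min + (max - min) * i / steps;
            double m = mu.applyAsDouble(x);
            num += x * m;
            den += m;
        }
        return den == 0 ? (min + max) / 2 : num / den;
    }

    public static void main(String[] args) {
        // Example rule output: "risk is HIGH" clipped at activation 0.6 (Mamdani Min).
        double activation = 0.6;
        java.util.function.DoubleUnaryOperator high =
            x -> Math.min(activation, trapezoid(x, 50, 70, 95, 100));
        System.out.printf("Defuzzified risk = %.1f%n", centroid(high, 0, 100, 1000));
    }
}
```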
The prototype was validated with 25 students and compared against the JFuzzyLogic framework, and it proved effective and intuitive to use.
The code is not publicly available because it is part of a thesis project...
Eight Queens – Solving the Classic Chess Problem with Backtracking
Have you ever wondered how to place eight queens on a chessboard so that none of them attack each other?
The Eight Queens Problem is a famous computer science challenge that tests logical reasoning, recursion, and backtracking techniques.
This Java project implements a complete solution for the Eight Queens problem using backtracking to explore all possible safe positions on an 8x8 chessboard.
Each queen is placed in a row while checking if its column and diagonals are free. If no valid position is found, the algorithm backtracks to adjust previous placements.
How It Works
The algorithm places one queen per row, trying columns from left to right. Before placing a queen on a candidate square, it checks:
If another queen exists in the same column
If another queen exists on the upper-left diagonal
If another queen exists on the upper-right diagonal
If a valid position is found, it places the queen and proceeds to the next row. Otherwise, it removes the last placed queen and continues exploring alternatives (backtracking).
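A condensed sketch of that logic is shown below; the repository version is organized differently, but the safety check and the recursive backtracking follow the same idea.

```java
// Condensed sketch of the Eight Queens backtracking solution.
// board[row] = column of the queen placed in that row (-1 = empty).
public class EightQueensSketch {
    static final int N = 8;

    static boolean isSafe(int[] board, int row, int col) {
        for (int r = 0; r < row; r++) {
            int c = board[r];
            // Conflict if a previous queen shares the column or one of the upper diagonals.
            if (c == col || Math.abs(c - col) == row - r) return false;
        }
        return true;
    }

    static boolean solve(int[] board, int row) {
        if (row == N) return true;                  // all queens placed
        for (int col = 0; col < N; col++) {         // try columns left to right
            if (isSafe(board, row, col)) {
                board[row] = col;
                if (solve(board, row + 1)) return true;
                board[row] = -1;                    // backtrack and try the next column
            }
        }
        return false;
    }

    public static void main(String[] args) {
        int[] board = new int[N];
        java.util.Arrays.fill(board, -1);
        if (solve(board, 0)) System.out.println(java.util.Arrays.toString(board));
    }
}
```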
Why Eight Queens?
This project was one of my first implementations of backtracking algorithms in Java. It strengthened my understanding of recursion and problem decomposition, crucial for competitive programming and algorithm interviews.
Revisiting this code today reminds me how foundational challenges like this shape a developer’s analytical thinking.
You can review the full code on my GitHub repository:
BackPropagation Neural Network – Learning by Adjusting Weights
Have you ever wondered how neural networks actually learn? BackPropagation is the core algorithm behind training many artificial neural networks used today in AI and machine learning applications.
This Java project implements a simple feedforward neural network with BackPropagation learning. It adjusts the weights of neurons based on the error between expected and actual outputs, reducing the error over time to improve prediction accuracy.
How It Works
The neural network consists of an input layer, one hidden layer, and an output layer. The BackPropagation algorithm operates in two phases:
Forward pass: Input data propagates through the network to generate an output.
Backward pass: The error between expected and predicted output is calculated, and gradients are propagated back through the network to adjust weights and biases using the learning rate.
This iterative process continues for multiple epochs until the network reaches a desired accuracy or minimum error threshold.
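The sketch below condenses both phases into a tiny 2-2-1 network trained on the logical AND function; the layer sizes, learning rate, and training data are illustrative choices, not the configuration used in my repository.

```java
import java.util.Random;

// Minimal 2-2-1 feedforward network trained with BackPropagation on logical AND.
// An illustrative sketch of the two phases, not the repository code.
public class BackpropSketch {
    static double sigmoid(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

    public static void main(String[] args) {
        double[][] inputs  = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[]   targets = {0, 0, 0, 1};
        double lr = 0.5;

        Random rnd = new Random(42);
        double[][] wHidden = new double[2][2];   // input -> hidden weights
        double[]   bHidden = new double[2];
        double[]   wOut    = new double[2];      // hidden -> output weights
        double     bOut    = 0;
        for (int j = 0; j < 2; j++) {
            bHidden[j] = rnd.nextDouble() - 0.5;
            wOut[j] = rnd.nextDouble() - 0.5;
            for (int i = 0; i < 2; i++) wHidden[j][i] = rnd.nextDouble() - 0.5;
        }

        for (int epoch = 0; epoch < 10000; epoch++) {
            for (int s = 0; s < inputs.length; s++) {
                double[] x = inputs[s];

                // Forward pass: hidden activations, then the network output.
                double[] h = new double[2];
                for (int j = 0; j < 2; j++)
                    h[j] = sigmoid(wHidden[j][0] * x[0] + wHidden[j][1] * x[1] + bHidden[j]);
                double out = sigmoid(wOut[0] * h[0] + wOut[1] * h[1] + bOut);

                // Backward pass: output delta, then hidden deltas via the chain rule.
                double deltaOut = (targets[s] - out) * out * (1 - out);
                double[] deltaH = new double[2];
                for (int j = 0; j < 2; j++)
                    deltaH[j] = deltaOut * wOut[j] * h[j] * (1 - h[j]);

                // Weight and bias updates scaled by the learning rate.
                for (int j = 0; j < 2; j++) {
                    wOut[j] += lr * deltaOut * h[j];
                    for (int i = 0; i < 2; i++) wHidden[j][i] += lr * deltaH[j] * x[i];
                    bHidden[j] += lr * deltaH[j];
                }
                bOut += lr * deltaOut;
            }
        }

        // Report the trained network's predictions for each input pattern.
        for (int s = 0; s < inputs.length; s++) {
            double[] x = inputs[s];
            double[] h = new double[2];
            for (int j = 0; j < 2; j++)
                h[j] = sigmoid(wHidden[j][0] * x[0] + wHidden[j][1] * x[1] + bHidden[j]);
            double out = sigmoid(wOut[0] * h[0] + wOut[1] * h[1] + bOut);
            System.out.printf("%s -> %.3f (target %.0f)%n",
                    java.util.Arrays.toString(x), out, targets[s]);
        }
    }
}
```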
Why BackPropagation?
This was my first implementation of neural network training from scratch without using any external ML libraries. It helped me understand the mathematics of gradient descent, activation functions, and weight updates — fundamental concepts behind modern AI models.
Revisiting this code today reminds me of the importance of understanding the underlying algorithms that power high-level frameworks.
You can review the full code on my GitHub repository:
Hebbian Learning – The Algorithm That Mimics Brain Synapses
“Neurons that fire together wire together.” This simple principle, proposed by Donald Hebb in 1949, became the foundation of Hebbian Learning – a neural learning algorithm inspired by the way biological neurons strengthen their connections.
This Java project implements the Hebb learning rule to train a single-layer neural network for pattern recognition. Unlike BackPropagation, Hebbian learning is an unsupervised method that strengthens the connection between neurons that activate simultaneously.
How It Works
The algorithm updates the weights of each neuron using the formula:
wi = wi + xi * y
Where xi is the input and y is the output of the neuron. Over multiple training examples, weights adjust to represent the correlation between inputs and outputs, enabling the network to recognize specific patterns.
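As an illustration, the sketch below applies that update rule to the classic bipolar AND example, where each training pattern comes paired with the output y it should produce; the example and variable names are mine, not the repository's.

```java
// Sketch of the Hebb learning rule (w_i = w_i + x_i * y) on the classic
// bipolar AND example: inputs and outputs are -1/+1, plus a bias input fixed at 1.
// An illustrative toy, not the repository code.
public class HebbSketch {
    public static void main(String[] args) {
        // Bipolar AND: output +1 only when both inputs are +1.
        double[][] inputs  = {{1, 1}, {1, -1}, {-1, 1}, {-1, -1}};
        double[]   outputs = {1, -1, -1, -1};

        double[] w = new double[2];   // connection weights
        double   b = 0;               // bias weight (its input is fixed at 1)

        // Hebb rule: for each pattern, strengthen each weight by x_i * y.
        for (int p = 0; p < inputs.length; p++) {
            for (int i = 0; i < 2; i++) w[i] += inputs[p][i] * outputs[p];
            b += outputs[p];
        }

        // Recall: the neuron fires (+1) when the weighted sum is positive.
        for (double[] x : inputs) {
            double net = w[0] * x[0] + w[1] * x[1] + b;
            System.out.printf("(%+.0f, %+.0f) -> %+d%n", x[0], x[1], net > 0 ? 1 : -1);
        }
    }
}
```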
Why Hebbian Learning?
This project helped me understand the origins of neural network learning before the advent of gradient-based methods like BackPropagation. Hebbian learning remains influential in neuroscience and AI research, especially in unsupervised and reinforcement learning models.
Revisiting this implementation reinforced my understanding of how biological inspiration drives computational algorithms.
You can review the full code on my GitHub repository:
LMS Compiler – Turning Source Code into Machine Instructions
Have you ever wondered how programming languages translate your code into machine instructions? Compilers are the backbone of this process, analyzing, interpreting, and converting high-level languages into executable form.
This Java project implements a basic compiler for the LMS (Language Made Simple) language, built entirely from scratch as part of my formal languages and compilers studies.
It includes lexical analysis, syntax analysis (parsing), semantic checking, and code generation stages, providing a full pipeline for transforming source code into target output.
How It Works
The compiler performs the following steps:
Lexical Analysis: Tokenizes the input source code, identifying keywords, identifiers, numbers, and symbols.
Syntax Analysis: Parses the sequence of tokens based on LMS grammar rules to build a parse tree representing the program structure.
Semantic Analysis: Checks for meaning correctness, such as variable declarations and type consistency.
Code Generation: Produces an intermediate or final code output simulating how a real compiler generates machine code or bytecode.
The project demonstrates the design of deterministic finite automata, parsing techniques, and compiler architecture principles.
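As a taste of the first stage, here is a toy lexer in the same spirit; the keyword set and token categories are simplified assumptions, not the actual LMS language definition.

```java
import java.util.ArrayList;
import java.util.List;

// Toy lexer illustrating the lexical analysis stage.
// The keyword set and token categories are simplified assumptions,
// not the real LMS grammar.
public class LexerSketch {
    enum Kind { KEYWORD, IDENTIFIER, NUMBER, SYMBOL }

    record Token(Kind kind, String text) {}

    static final List<String> KEYWORDS = List.of("if", "then", "else", "while", "end");

    static List<Token> tokenize(String src) {
        List<Token> tokens = new ArrayList<>();
        int i = 0;
        while (i < src.length()) {
            char c = src.charAt(i);
            if (Character.isWhitespace(c)) { i++; continue; }
            if (Character.isLetter(c)) {                      // keyword or identifier
                int start = i;
                while (i < src.length() && Character.isLetterOrDigit(src.charAt(i))) i++;
                String word = src.substring(start, i);
                tokens.add(new Token(KEYWORDS.contains(word) ? Kind.KEYWORD : Kind.IDENTIFIER, word));
            } else if (Character.isDigit(c)) {                // integer literal
                int start = i;
                while (i < src.length() && Character.isDigit(src.charAt(i))) i++;
                tokens.add(new Token(Kind.NUMBER, src.substring(start, i)));
            } else {                                          // single-character symbol
                tokens.add(new Token(Kind.SYMBOL, String.valueOf(c)));
                i++;
            }
        }
        return tokens;
    }

    public static void main(String[] args) {
        tokenize("if x1 > 10 then y := 0 end").forEach(System.out::println);
    }
}
```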
Why Build a Compiler?
Writing a compiler from scratch deepened my understanding of how programming languages work internally, how syntax and semantics are defined, and how interpreters and runtime environments process code.
It strengthened my skills in theoretical computer science, algorithm design, and software architecture — essential knowledge for any developer aiming to master backend systems and language processing.
You can review the full code on my GitHub repository:
"Beatriz is a professional with great potential. Very studious and dedicated. If she doesn't know how to do something, she is willing to learn and soon executes with quality. It's clear she loves technology."
— Henrique Costa, Software QA Engineer (worked with Beatriz, Aug 2021)
"Beatriz is a highly qualified professional. I had the opportunity to work on the same team and closely follow her work. She is very dedicated, always bringing new solutions and accurate analysis. Communicative, intelligent, and a great team player. I recommend Beatriz for any IT team."
— Julio Henrique, Systems Analyst, C# Developer at B3 S.A. (team member, Aug 2021)
"I've worked with Bia for years in technology. It's easy to spot her dedication, helpfulness, and competence. Always evolving and maintaining quality user support with fast solutions."
— Ezequiel C. Silveira, Senior IT Infrastructure Analyst (team member, Jun 2020)
"Competent professional, committed, great at teamwork, problem solver, passionate about technology, always willing to help users with politeness and efficiency. Her professional growth is noticeable daily. A promising future ahead."
— Geruza Silva, Anima Educação (team member, Apr 2020)
"I had the pleasure to supervise Beatriz for 2 years. She was our intern and later hired without hesitation. Competent, hardworking, always striving to give the best to users. Caring, committed, motivated. She has a bright future."
— Tatiane dos Santos Leal, IT and Processes Coordinator at Fundação InoversaSul (manager, Apr 2020)
Get in Touch
If you want to discuss projects, collaborations, or opportunities, feel free to connect with me on my networks below or send a direct message through the form.