At times, it becomes necessary to develop intricate automations before or after collecting data. The versatility of RTILA allows seamless integration with virtually all RPA tools available in the market.
Today, I will demonstrate how to invoke an RTILA-compiled bot from (almost) any RPA tool.
The overall flow looks like the image below.
To begin, open RTILA and create a basic project. In my case, I will be scraping some posts.
In this example, we will extract a bunch of post titles, dates, and links. Create a simple project on your end, just to play with.
Navigate to BOTS and compile your project into a standalone bot. Since I am using Windows, I will save it in the «c:\bots» folder on my local disk with the name «rtilabot.bot».
I have chosen JSON as the output format.
Now download the RTILA bot launcher and store it in the same folder.
https://github.com/IKAJIAN/rtila-bot-launcher/releases/tag/v6.0.0
We are going to create a couple of files in the same folder. Open your favourite text editor and create the first one, «start.bat».
start "" "c:\bots\rtila-cli-win.exe" "c:\bots\rtilabot.bot" cscript //nologo sendEnter.vbs
And create a second file named «sendEnter.vbs»:
Set WshShell = WScript.CreateObject("WScript.Shell")
WScript.Sleep 1000 'Wait for 1 second (1000 milliseconds)
WshShell.SendKeys "{ENTER}"
Currently, the RTILA launcher requires us to manually select the bot for execution. We are automating this process by sending the ENTER key.
That completes the RTILA side of the setup. To test it, you can run start.bat from the command line or simply double-click it.
After a few moments, a newly created file will contain the scraped information. The results of my scrape look like this:
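In general, with JSON as the output format the file holds one record per scraped item, carrying the fields defined in the project. The snippet below is only an illustration with placeholder values; the actual property names and structure depend on how your own RTILA project is configured.

[
  {
    "Title": "Example post title",
    "Date": "2024-01-15",
    "Link": "https://example.com/example-post"
  }
]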
Using Power Automate / WinAutomation
This project demonstrates the integration of an RTILA-compiled bot into Power Automate.
These are the automation steps we will create:
The Power Automate version is essentially identical. In this tutorial I will use WinAutomation screenshots, simply because they are in English rather than Spanish:
In this automation, we execute the start.bat file to initiate the RTILA bot. The first action is configured as follows.
Don’t worry about setting any wait time; it’s not necessary, as our RPA tool will wait for our bot to finish.
The next step is to read the JSON files, retrieving the most recent one.
The advanced tab settings:
Then we open the file and read its contents.
The next step is to convert the results from JSON to a custom object. This action translates the JSON text into a custom object and stores it in a new variable for later use.
Our final step accesses one of the results and displays it on the screen.
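As a rough guide only: in Power Automate Desktop the conversion action stores its output in a variable (by default something like %JsonAsCustomObject%), and a single field can then be referenced with an expression along the lines of the one below. Both the variable name and the Title property are assumptions here; use whatever output name and field names your own flow and RTILA project define.

%JsonAsCustomObject[0]['Title']%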
With the automation in place, you can easily access and utilize the scraped information as needed.
Using Robomotion RPA
Working with Robomotion is quite similar.
The Inject and Stop nodes are the start and the end of the automation.
You’ll notice the «Start RTILA Process» node, which is very easy to set up.
These variables are defined in the previous JavaScript (JS) function node:
msg.rtilaExecutablePath = "C:\\bots\\start.bat";
msg.workingDirectory = "C:\\bots\\";
return msg;
When the RTILA task is done, we can list the directory and read the most recent file, as we did with Power Automate.
This is the input/output of the List Directory node.
We build the full file path in a JS function node so the file can be read correctly:
msg.resultsPath = msg.workingDirectory.concat(msg.files[0].Name);
return msg;
Then we read the file as follows:
And parse the JSON results.
msg.parsedResults = JSON.parse(msg.text);
return msg;
Now we are ready to work with the results; for this example I will simply access one element and display it.
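For instance, a small JS function node along the following lines could pull out the first title before handing it to a message or debug node. The Title property is an assumption; use whatever field names your RTILA project actually produces.

// Illustrative sketch: assumes parsedResults is an array of objects
// whose property names match the fields defined in the RTILA project.
msg.firstTitle = msg.parsedResults[0].Title;
return msg;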
This is the resulting execution from the flow designer:
Here we conclude our discussion on using RTILA with RPA tools. I hope the integration guidance and detailed examples, complete with screenshots, have proven helpful.
By following these steps, you can effectively call the RTILA scraper with other similar RPA tools and improve your workflows.
Good luck with your RTILA and RPA endeavors, and thank you for reading!
Farewell, my fellow scraper titan!