Backend with a REST API and web-scraping crontab scripts to fetch personal bank transactions
For personal use only
This software will only match your needs if you:
- Are an Ibercaja customer with online read access
- Have an Afterbanks account
Otherwise, you will have to adapt the code to scrape your own online banking website and/or bank aggregator
- Set your personal values in auth.php
- Set the database connection data in the docker compose file
- Set the database connection data in db.php
With Docker installed on your computer, run the following in your shell from the project root folder:
docker compose up -d
This will pull the required images, including PHP on an Apache web server and a MySQL database, and import mock data into the database.
Watch out for the ports in the docker compose file, in case they are already in use. Where needed, change the host side (the left value) of each host:container ports pair.
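As an illustration, remapping only the host side of a published port looks like this in a docker compose file (the service name and port numbers below are examples, not taken from the actual file):

```yaml
services:
  web:              # hypothetical service name
    ports:
      - "8080:80"   # host:container — change 8080 if port 80 is busy on your host
```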
- Copy the project files to the Apache web server document root or virtual host root
- When using Nginx, convert the .htaccess rules to Nginx directives
- Watch out for possible conflicts between the .htaccess rules and your virtual host configuration
- Import mock/database.sql into your MySQL instance
- Remove the mock folder when on a production environment
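As a sketch, the manual steps above might look like this on a typical Apache setup (the document root path, database name, and user are placeholders you must adapt to your system):

```shell
# Copy the project into the web server document root (path is an assumption)
cp -R . /var/www/html/

# Import the mock data (database and user names are placeholders)
mysql -u youruser -p yourdatabase < mock/database.sql

# On a production environment, remove the mock folder afterwards
rm -rf /var/www/html/mock
```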
- Include the following scripts in the crontab to start feeding your database
- ibercaja.php for direct online banking
- ing.php for your bank aggregator
- Set your cron intervals to your needs so that your sources' APIs don't ban your scripts. In my case, it's 30 minutes for Ibercaja and 1 hour for Afterbanks
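With those intervals, the crontab entries might look like this (the PHP binary location and project path are assumptions — adjust them to your system):

```
# Ibercaja direct online banking, every 30 minutes
*/30 * * * * /usr/bin/php /path/to/project/ibercaja.php

# Afterbanks bank aggregator, every hour
0 * * * * /usr/bin/php /path/to/project/ing.php
```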
- Set up HTTPS with a valid certificate and add the following rule to .htaccess:
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [R=301,L]
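If you serve the project with Nginx instead of Apache, the usual equivalent of this redirect is a dedicated port-80 server block rather than a rewrite (the domain below is the same placeholder used above):

```nginx
server {
    listen 80;
    server_name www.yourdomain.com;
    # Permanently redirect all plain-HTTP traffic to HTTPS
    return 301 https://www.yourdomain.com$request_uri;
}
```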
- Import the localhost environment mock/postman_environment.json into Postman. Change the host environment variable to host:port in case you changed port 80 in the docker compose file
- Import the requests collection mock/postman_collection.json into Postman
- Use the api/login endpoint prior to any other request to get authorized (it sets the token variable value in the environment)
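Outside Postman, the same flow can be sketched with curl. The host, credential field names, endpoint after login, and token header below are all assumptions — check mock/postman_collection.json for the real request format:

```shell
# Log in first; the response is assumed to contain a token
curl -s -X POST "http://localhost/api/login" \
     -d "user=youruser" -d "password=yourpassword"

# Then send the returned token with subsequent requests, e.g. as a
# header (endpoint and header name here are hypothetical)
curl -s "http://localhost/api/transactions" \
     -H "Authorization: Bearer YOUR_TOKEN"
```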