
Commit

v1.0.7
kawadhiya21 committed Oct 17, 2020
1 parent 509f093 commit 65c5208
Showing 7 changed files with 733 additions and 34 deletions.
125 changes: 120 additions & 5 deletions README.MD
@@ -37,10 +37,37 @@ var connection = mysql.createPool({
migration.init(connection, __dirname + '/migrations');
```

### Advanced Setup
If you want to execute something after all migrations have run, you can pass a third parameter as a callback function. Example:

```
// migration.js
var mysql = require('mysql');
var migration = require('mysql-migrations');
var connection = mysql.createPool({
  connectionLimit : 10,
  host     : 'localhost',
  user     : 'root',
  password : 'password',
  database : 'your_database'
});
migration.init(connection, __dirname + '/migrations', function() {
  console.log("finished running migrations");
});
```

## Adding Migrations

### Initiate a migration
Run

```
node migration.js add migration create_table_users
```

Now open the migrations folder and locate the newest file, the one with the greatest timestamp prefix. Its name combines that timestamp with the name given in the command, such as `12213545345_create_table_users.js`.

### Add migrations
Write the forward migration query in the `up` key of the generated JSON object. As good practice, also write the script to roll back the migration in the `down` key. Example:
@@ -53,7 +80,11 @@ module.exports = {
```

### Add seed
Run
```
node migration.js add seed create_table_users
```
to add a seed.

```
module.exports = {
```

@@ -64,7 +95,11 @@

### Initiate and Add migration in single command

Run
```
node migration.js add migration create_table_users "CREATE TABLE mysql_migrations_347ertt3e (user_id INT NOT NULL, UNIQUE KEY user_id (user_id) )"
```
Locate the newest file, the one with the greatest timestamp prefix, and open it. The query will have been added automatically under the `up` key. However, the `down` key needs to be filled in manually.

### Custom migrations
You may initiate the migration file and add a function.
@@ -88,6 +123,15 @@ There are a few ways to run migrations.
3. Run `node migration.js down`. Runs only 1 `down` migration.
4. Run `node migration.js refresh`. Runs all down migrations followed by all up.

Example Output:

```
UP: "CREATE TABLE users2 (user_id INT NOT NULL, UNIQUE KEY user_id (user_id), name TEXT )"
UP: "CREATE TABLE users (user_id INT NOT NULL, UNIQUE KEY user_id (user_id), name TEXT )"
UP: "CREATE TABLE users1 (user_id INT NOT NULL, UNIQUE KEY user_id (user_id), name TEXT )"
No more "UP" migrations to run
```

### Execute anonymous migrations
At times, a few migrations need to be run again, or anonymously. There can be a variety of reasons why old migrations need to be executed or rolled back. It can be done this way:

@@ -103,12 +147,83 @@ node migration.js run 1500891087394_create_table_users.js down

>> Since these are anonymous executions, no records of them are maintained.

## Executing backdated migrations
Suppose a few migrations were merged late into the main branch. Technically they should not run, because they are old migrations that should already have been applied. But it happens: someone works on a branch for a long time, and once it is merged into master the older migrations do not run, because the latest recorded migration timestamp is greater. There is a flag which helps run those migrations. There are two ways:
1. Single Run
```
node migration.js up --migrate-all
```
2. For all runs, you can configure it in the init call itself.
```
// migration.js
var mysql = require('mysql');
var migration = require('mysql-migrations');
var connection = mysql.createPool({
  connectionLimit : 10,
  host     : 'localhost',
  user     : 'root',
  password : 'password',
  database : 'your_database'
});
migration.init(connection, __dirname + '/migrations', function() {}, ["--migrate-all"]);
```
and then run migrations normally

```
node migration.js up
```

## Saving Schema and Loading from Schema
Keeping a schema.sql is useful because it lets you review schema changes in the PR itself. The schema is stored in the migrations folder under the file name `schema.sql`. There are two ways to generate the schema.
1. Single Run
```
node migration.js up --update-schema
```
2. For all runs, you can configure it in the init call itself.
```
// migration.js
var mysql = require('mysql');
var migration = require('mysql-migrations');
var connection = mysql.createPool({
  connectionLimit : 10,
  host     : 'localhost',
  user     : 'root',
  password : 'password',
  database : 'your_database'
});
migration.init(connection, __dirname + '/migrations', function() {}, ["--update-schema"]);
```
and then run migrations normally

```
node migration.js up
node migration.js down 2
..
```

The updated schema will be stored in the migrations folder after each run.

### Loading Directly from Schema
Suppose you are setting up a project and want to avoid running the entire migration history, loading the schema generated in the above process instead. You can do it via:

```
node migration.js load-from-schema
```

The above command will create all the tables and record entries in the migrations log table. It is helpful when onboarding new developers or setting up environments without running the entire migration process.

## Pending
>> Test cases: will pick up when I get time.

## Help and Support
Will be more than happy to improve upon this version. It is an overnight build and certainly needs improvement. Everyone who wants to contribute is welcome.

## Credits and other stuff
It is my first contribution to npm and I am rather happy about it. I made this when I was looking for a suitable tool with barebones settings that would let me maintain database structure. I could not find a basic one, so I wrote my own and finally decided to publish it. The first version took around 2 hours to write and barely worked, but it still did the job.

>>Credits to [ramnique](https://github.com/ramnique/) (I worked with him at Stayzilla and he is a great mentor).
>>And of course to my parents.
118 changes: 117 additions & 1 deletion core_functions.js
@@ -2,6 +2,8 @@ var fs = require("fs");

var fileFunctions = require('./file');
var queryFunctions = require('./query');
var colors = require('colors');
var exec = require('child_process').exec;
var table = require('./config')['table'];

function add_migration(argv, path, cb) {
@@ -32,6 +34,33 @@ function add_migration(argv, path, cb) {
}

function up_migrations(conn, max_count, path, cb) {
  queryFunctions.run_query(conn, "SELECT timestamp FROM " + table + " ORDER BY timestamp DESC LIMIT 1", function (results) {
    var file_paths = [];
    var max_timestamp = 0;
    if (results.length) {
      max_timestamp = results[0].timestamp;
    }

    fileFunctions.readFolder(path, function (files) {
      files.forEach(function (file) {
        var timestamp_split = file.split("_", 1);
        if (timestamp_split.length) {
          var timestamp = parseInt(timestamp_split[0]);
          // Run only migrations whose 13-digit timestamp prefix is newer than
          // the last recorded migration
          if (Number.isInteger(timestamp) && timestamp.toString().length == 13 && timestamp > max_timestamp) {
            file_paths.push({ timestamp: timestamp, file_path: file });
          }
        } else {
          throw new Error('Invalid file ' + file);
        }
      });

      var final_file_paths = file_paths.sort(function (a, b) { return (a.timestamp - b.timestamp); }).slice(0, max_count);
      queryFunctions.execute_query(conn, path, final_file_paths, 'up', cb);
    });
  });
}

function up_migrations_all(conn, max_count, path, cb) {
  queryFunctions.run_query(conn, "SELECT timestamp FROM " + table, function (results) {
    var file_paths = [];
    var timestamps = results.map(r => parseInt(r.timestamp));
@@ -85,9 +114,96 @@ function run_migration_directly(file, type, conn, path, cb) {
queryFunctions.run_query(conn, query, cb);
}

function update_schema(conn, path, cb) {
  var conn_config = conn.config.connectionConfig;
  var filePath = path + '/' + 'schema.sql';
  fs.unlink(filePath, function() {
    // Dump structure only; data stays out of schema.sql
    var cmd = "mysqldump --no-data ";
    if (conn_config.host) {
      cmd = cmd + " -h " + conn_config.host;
    }

    if (conn_config.port) {
      cmd = cmd + " --port=" + conn_config.port;
    }

    if (conn_config.user) {
      cmd = cmd + " --user=" + conn_config.user;
    }

    if (conn_config.password) {
      cmd = cmd + " --password=" + conn_config.password;
    }

    cmd = cmd + " " + conn_config.database;
    exec(cmd, function(error, stdout, stderr) {
      fs.writeFile(filePath, stdout, function(err) {
        if (err) {
          console.log(colors.red("Could not save schema file"));
        }
        cb();
      });
    });
  });
}

function createFromSchema(conn, path, cb) {
  var conn_config = conn.config.connectionConfig;
  var filePath = path + '/' + 'schema.sql';
  if (fs.existsSync(filePath)) {
    var cmd = "mysql ";
    if (conn_config.host) {
      cmd = cmd + " -h " + conn_config.host;
    }

    if (conn_config.port) {
      cmd = cmd + " --port=" + conn_config.port;
    }

    if (conn_config.user) {
      cmd = cmd + " --user=" + conn_config.user;
    }

    if (conn_config.password) {
      cmd = cmd + " --password=" + conn_config.password;
    }

    cmd = cmd + " " + conn_config.database;
    cmd = cmd + " < " + filePath;
    exec(cmd, function(error, stdout, stderr) {
      if (error) {
        console.log(colors.red("Could not load from Schema: " + error));
        cb();
      } else {
        var file_paths = [];
        fileFunctions.readFolder(path, function (files) {
          files.forEach(function (file) {
            var timestamp_split = file.split("_", 1);
            if (timestamp_split.length) {
              var timestamp = parseInt(timestamp_split[0]);
              file_paths.push({ timestamp: timestamp, file_path: file });
            } else {
              throw new Error('Invalid file ' + file);
            }
          });

          // Mark every migration as run without executing it
          var final_file_paths = file_paths.sort(function(a, b) { return (a.timestamp - b.timestamp); }).slice(0, 9999999);
          queryFunctions.execute_query(conn, path, final_file_paths, 'up', cb, false);
        });
      }
    });
  } else {
    console.log(colors.red("Schema Missing: " + filePath));
    cb();
  }
}

module.exports = {
  add_migration: add_migration,
  up_migrations: up_migrations,
  up_migrations_all: up_migrations_all,
  down_migrations: down_migrations,
  run_migration_directly: run_migration_directly,
  update_schema: update_schema,
  createFromSchema: createFromSchema
};
4 changes: 4 additions & 0 deletions file.js
@@ -13,6 +13,10 @@ function readFolder(path, cb) {
    throw err;
  }

  // Exclude schema.sql from the list of migration files
  var schemaPath = files.indexOf("schema.sql");
  if (schemaPath > -1) {
    files.splice(schemaPath, 1);
  }
  cb(files);
});
}
