
Commit

Merge branch 'master' of github.com:takapi327/ldbc into documentation/2023-09-Creation-of-Mdoc
takapi327 committed Nov 14, 2023
2 parents 0bcb7f7 + 4e40d31 commit 94ebfe8
Showing 5 changed files with 229 additions and 2 deletions.
39 changes: 39 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,39 @@
# Contributing

These guidelines are meant to be a living document that should be changed and adapted as needed. We encourage changes that make it easier to achieve our goals in an efficient way.

## Tooling

LDBC is built with [sbt](https://github.com/sbt/sbt), and you should be able to jump right in by running `sbt test`.

Please make sure to run `sbt scalafmtAll` (and commit the results!) before opening a pull request. This will take care of running scalafmt, ensuring that the build doesn't immediately fail on your work.

All PRs and branches are built using GitHub Actions, covering a large set of permutations of JVM versions, Scala versions, and operating systems.

## General Workflow

This is the process for committing code into `master`. There are of course exceptions to these rules, for example minor changes to comments and documentation, or fixing a broken build.

1. Before starting to work on a feature or a fix, make sure there is a ticket for your work in the project's issue tracker. If not, create it first.
2. You should always perform your work in a Git feature branch. The branch should be given a descriptive name that explains its intent. Some teams also like adding the ticket number and/or the [GitHub](http://github.com) user ID to the branch name; these details are up to each individual team.
3. When the feature or fix is completed you should open a [Pull Request](https://help.github.com/articles/using-pull-requests) on GitHub.
4. Pull Requests should be reviewed by other maintainers (as many as possible). Note that maintainers may include external contributors; external contributors (e.g. EPFL or independent committers) are encouraged to participate in the review process.
5. After the review you should fix the issues as needed (pushing a new commit for new review etc.), iterating until the reviewers give their thumbs up.
6. Once the code has passed review the Pull Request can be merged into the `master` branch.

## Pull Request Requirements

For a Pull Request to be considered at all it has to meet these requirements:

1. Regardless of whether the code introduces new features or fixes bugs or regressions, it must have comprehensive tests.
2. scalafmt must be applied to all Scala source code.
3. Source and binary compatibility must always be preserved.

If these requirements are not met, the code should not be merged into `master`, or even reviewed, regardless of how good or important it is. No exceptions.

## Work In Progress

It is ok to work on a public feature branch in the GitHub repository; this can sometimes be useful for early feedback. If so, it is preferable to name the branch accordingly. This can be done by either prefixing the name with ``wip-`` (as in ‘Work In Progress’) or using hierarchical names like ``wip/..``, ``feature/..`` or ``topic/..``. Either way is fine as long as it is clear that it is work in progress and not ready for merge. This work can temporarily have a lower standard. However, to be merged into `master` it will have to go through the regular process outlined above, with a Pull Request, review, etc.

To facilitate both well-formed commits and working together, the ``wip`` and ``feature``/``topic`` identifiers also have special meaning. Any branch labelled with ``wip`` is considered “git-unstable” and may be rebased and have its history rewritten. Any branch with ``feature``/``topic`` in the name is considered “stable” enough for others to depend on when a group is working on a feature.
@@ -467,6 +467,38 @@ object DatabaseConnectionTest extends Specification:
(result._1 === "update Odawara") and (result._2 !== Some("not update Kanagawa"))
}

"If the primary key is duplicated, the data is updated." in {
val result = (for
_ <- city.insertOrUpdates(List(City(1638, "update Kofu", "JPN", "Yamanashi", 199753))).update
updated <- city.select(v => (v.name, v.district)).where(_.id _equals 1638).query.unsafe
yield updated).transaction
.run(dataSource)
.unsafeRunSync()
(result._1 === "update Kofu") and (result._2 !== Some("not update Yamanashi"))
}

"If there are duplicate primary keys, only the specified columns are updated." in {
val result = (for
_ <- (city += City(1639, "update Kushiro", "JPN", "not update Hokkaido", 197608))
.onDuplicateKeyUpdate(_.name)
.update
updated <- city.select(v => (v.name, v.district)).where(_.id _equals 1639).query.unsafe
yield updated).transaction
.run(dataSource)
.unsafeRunSync()
(result._1 === "update Kushiro") and (result._2 !== Some("not update Hokkaido"))
}

"Data is added if the primary key is not duplicated." in {
(for
empty <- city.selectAll.where(_.id _equals 5000).query.headOption
_ <- city.insertOrUpdate((5000, "Nishinomiya", "JPN", "Hyogo", 0)).update
data <- city.selectAll.where(_.id _equals 5000).query.headOption
yield empty.isEmpty and data.nonEmpty).transaction
.run(dataSource)
.unsafeRunSync()
}

"The update succeeds in the combined processing of multiple queries." in {
(for
codeOpt <- country
@@ -101,7 +101,9 @@ case class TableQuery[F[_], P <: Product](table: Table[P]) extends Dynamic:
* @param values
* A list of Tuples constructed with all the property types that Table has.
*/
inline def insert(using mirror: Mirror.ProductOf[P])(values: mirror.MirroredElemTypes*): Insert[F, P] =
inline def insert(using mirror: Mirror.ProductOf[P])(
values: mirror.MirroredElemTypes*
): Insert[F, P] =
val parameterBinders = values
.flatMap(
_.zip(Parameter.fold[F, mirror.MirroredElemTypes])
@@ -176,6 +178,56 @@ case class TableQuery[F[_], P <: Product](table: Table[P]) extends Dynamic:
.asInstanceOf[List[ParameterBinder[F]]]
new MultiInsert[F, P, Tuple](this, tuples, parameterBinders)

/** A method to build a query model that inserts data in all columns defined in the table or updates the data if there
* are duplicate primary keys.
*
* @param mirror
* product isomorphism map
* @param values
* A list of Tuples constructed with all the property types that Table has.
*/
inline def insertOrUpdate(using mirror: Mirror.ProductOf[P])(
values: mirror.MirroredElemTypes*
): DuplicateKeyUpdateInsert[F] =
val parameterBinders = values
.flatMap(
_.zip(Parameter.fold[F, mirror.MirroredElemTypes])
.map[ParamBind](
[t] =>
(x: t) =>
val (value, parameter) = x.asInstanceOf[(t, Parameter[F, t])]
ParameterBinder[F, t](value)(using parameter)
)
.toList
)
.toList
.asInstanceOf[List[ParameterBinder[F]]]
new DuplicateKeyUpdate[F, P, Tuple](this, values.toList, parameterBinders)
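  // A minimal usage sketch, assuming a TableQuery named `city` bound to a `City`
  // model as in the tests added by this commit: the tuple supplies every column
  // value, and a duplicate primary key triggers an update instead of an insert.
  //
  //   city.insertOrUpdate((5000, "Nishinomiya", "JPN", "Hyogo", 0)).update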

/** A method to build a query model that inserts data in all columns defined in the table or updates the data if there
* are duplicate primary keys.
*
* @param values
* A class that implements a [[Product]] that is one-to-one with the table definition.
* @param mirror
* product isomorphism map
*/
inline def insertOrUpdates(values: List[P])(using mirror: Mirror.ProductOf[P]): DuplicateKeyUpdateInsert[F] =
val tuples = values.map(Tuple.fromProductTyped)
val parameterBinders = tuples
.flatMap(
_.zip(Parameter.fold[F, mirror.MirroredElemTypes])
.map[ParamBind](
[t] =>
(x: t) =>
val (value, parameter) = x.asInstanceOf[(t, Parameter[F, t])]
ParameterBinder[F, t](value)(using parameter)
)
.toList
)
.asInstanceOf[List[ParameterBinder[F]]]
new DuplicateKeyUpdate[F, P, Tuple](this, tuples, parameterBinders)
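  // A minimal usage sketch, assuming the same hypothetical `city` query and `City`
  // model: each element of the list becomes one row of the generated
  // INSERT ... ON DUPLICATE KEY UPDATE statement.
  //
  //   city.insertOrUpdates(List(City(1638, "update Kofu", "JPN", "Yamanashi", 199753))).update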

/** A method to build a query model that updates specified columns defined in a table.
*
* @param tag
@@ -7,6 +7,7 @@ package ldbc.query.builder.statement
import ldbc.core.Column
import ldbc.sql.*
import ldbc.query.builder.TableQuery
import ldbc.query.builder.interpreter.Tuples

/** Trait for building Statements to be added.
*
@@ -16,10 +17,31 @@ import ldbc.query.builder.TableQuery
* Base trait for all products
*/
private[ldbc] trait Insert[F[_], P <: Product] extends Command[F]:
self =>

/** Trait for generating SQL table information. */
/** A model for generating queries from Table information. */
def tableQuery: TableQuery[F, P]

/** Methods for constructing INSERT ... ON DUPLICATE KEY UPDATE statements. */
def onDuplicateKeyUpdate[T](func: TableQuery[F, P] => T)(using
Tuples.IsColumnQuery[F, T] =:= true
): DuplicateKeyUpdateInsert[F] =
val duplicateKeys = func(self.tableQuery) match
case tuple: Tuple => tuple.toList.map(column => s"$column = new_${ tableQuery.table._name }.$column")
case column => List(s"$column = new_${ tableQuery.table._name }.$column")
new DuplicateKeyUpdateInsert[F]:
override def params: Seq[ParameterBinder[F]] = self.params

override def statement: String =
s"${ self.statement } AS new_${ tableQuery.table._name } ON DUPLICATE KEY UPDATE ${ duplicateKeys.mkString(", ") }"

/** Insert trait that provides a method to update in case of duplicate keys.
*
* @tparam F
* The effect type
*/
trait DuplicateKeyUpdateInsert[F[_]] extends Command[F]

/** A model for constructing INSERT statements that insert single values in MySQL.
*
* @param tableQuery
@@ -121,3 +143,34 @@ case class SelectInsert[F[_], P <: Product, T](
case (value: Any, parameter: Any) =>
ParameterBinder[F, Any](value)(using parameter.asInstanceOf[Parameter[F, Any]])
})

/** A model for constructing ON DUPLICATE KEY UPDATE statements that insert multiple values in MySQL.
*
* @param tableQuery
* Trait for generating SQL table information.
* @param tuples
* Tuple type value of the property with type parameter P.
* @param params
* A list of Traits that generate values from Parameter, allowing PreparedStatement to be set to a value by index
* only.
* @tparam F
* The effect type
* @tparam P
* Base trait for all products
* @tparam T
* Tuple type of the property with type parameter P
*/
case class DuplicateKeyUpdate[F[_], P <: Product, T <: Tuple](
tableQuery: TableQuery[F, P],
tuples: List[T],
params: Seq[ParameterBinder[F]]
) extends DuplicateKeyUpdateInsert[F]:

private val values = tuples.map(tuple => s"(${ tuple.toArray.map(_ => "?").mkString(", ") })")

private val duplicateKeys = tableQuery.table.all.map(column => s"$column = new_${ tableQuery.table._name }.$column")

override val statement: String =
s"INSERT INTO ${ tableQuery.table._name } (${ tableQuery.table.all.mkString(", ") }) VALUES${ values.mkString(
", "
) } AS new_${ tableQuery.table._name } ON DUPLICATE KEY UPDATE ${ duplicateKeys.mkString(", ") }"
@@ -129,6 +129,57 @@ class TableQueryTest extends AnyFlatSpec:
Test(2L, "p2", None)
)).statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?), (?, ?, ?)"
)
assert(
query
.insertOrUpdate((1L, "p2", Some("p3")))
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`, `p2` = new_test.`p2`, `p3` = new_test.`p3`"
)
assert(
query
.insertOrUpdates(
List(
Test(1L, "p2", Some("p3")),
Test(2L, "p2", None)
)
)
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?), (?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`, `p2` = new_test.`p2`, `p3` = new_test.`p3`"
)
assert(
query
.insert((1L, "p2", Some("p3")))
.onDuplicateKeyUpdate(_.p1)
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`"
)
assert(
query
.insert((1L, "p2", Some("p3")))
.onDuplicateKeyUpdate(v => (v.p1, v.p2, v.p3))
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`, `p2` = new_test.`p2`, `p3` = new_test.`p3`"
)
assert(
(query += Test(1L, "p2", Some("p3")))
.onDuplicateKeyUpdate(_.p1)
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`"
)
assert(
(query += Test(1L, "p2", Some("p3")))
.onDuplicateKeyUpdate(v => (v.p1, v.p2, v.p3))
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`, `p2` = new_test.`p2`, `p3` = new_test.`p3`"
)
assert(
(query ++= List(
Test(1L, "p2", Some("p3")),
Test(2L, "p2", None)
)).onDuplicateKeyUpdate(_.p1)
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?), (?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`"
)
assert(
(query ++= List(
Test(1L, "p2", Some("p3")),
Test(2L, "p2", None)
)).onDuplicateKeyUpdate(v => (v.p1, v.p2, v.p3))
.statement === "INSERT INTO test (`p1`, `p2`, `p3`) VALUES(?, ?, ?), (?, ?, ?) AS new_test ON DUPLICATE KEY UPDATE `p1` = new_test.`p1`, `p2` = new_test.`p2`, `p3` = new_test.`p3`"
)
}

it should "The update query statement generated from Table is equal to the specified query statement." in {
