Doc: Modify docs to fix old naming #10199

Merged: 6 commits, Apr 23, 2024
5 changes: 2 additions & 3 deletions datafusion-cli/src/catalog.rs
@@ -345,10 +345,9 @@ mod tests {
if cfg!(windows) { "USERPROFILE" } else { "HOME" },
test_home_path,
);
-let input =
-"~/Code/arrow-datafusion/benchmarks/data/tpch_sf1/part/part-0.parquet";
+let input = "~/Code/datafusion/benchmarks/data/tpch_sf1/part/part-0.parquet";
let expected = format!(
"{}{}Code{}arrow-datafusion{}benchmarks{}data{}tpch_sf1{}part{}part-0.parquet",
"{}{}Code{}datafusion{}benchmarks{}data{}tpch_sf1{}part{}part-0.parquet",
test_home_path,
MAIN_SEPARATOR,
MAIN_SEPARATOR,
2 changes: 1 addition & 1 deletion datafusion-examples/README.md
@@ -31,7 +31,7 @@ To run the examples, use the `cargo run` command, such as:

```bash
git clone https://github.com/apache/datafusion
-cd arrow-datafusion
+cd datafusion
# Download test data
git submodule update --init

2 changes: 1 addition & 1 deletion datafusion-examples/examples/flight/flight_sql_server.rs
@@ -73,7 +73,7 @@ macro_rules! status {
///
/// JDBC connection string: "jdbc:arrow-flight-sql://127.0.0.1:50051/"
///
-/// Based heavily on Ballista's implementation: https://github.com/apache/arrow-ballista/blob/main/ballista/scheduler/src/flight_sql.rs
+/// Based heavily on Ballista's implementation: https://github.com/apache/datafusion-ballista/blob/main/ballista/scheduler/src/flight_sql.rs
/// and the example in arrow-rs: https://github.com/apache/arrow-rs/blob/master/arrow-flight/examples/flight_sql_server.rs
///
#[tokio::main]
2 changes: 1 addition & 1 deletion datafusion/sqllogictest/README.md
@@ -225,7 +225,7 @@ query <type_string> <sort_mode>
<expected_result>
```

-- `test_name`: Uniquely identify the test name (arrow-datafusion only)
+- `test_name`: Uniquely identify the test name (Datafusion only)
- `type_string`: A short string that specifies the number of result columns and the expected datatype of each result
column. There is one character in the <type_string> for each result column. The character codes are:
- 'B' - **B**oolean,
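
For context, a complete test in the format described above might look like the following sketch; the table `foo` and its rows are hypothetical, with `I` and `T` marking an integer and a text result column:

```
# two result columns: Integer ('I') and Text ('T'); rows are sorted before comparison
query IT rowsort
SELECT id, name FROM foo;
----
1 alice
2 bob
```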
@@ -142,21 +142,21 @@ fn normalize_paths(mut row: Vec<String>) -> Vec<String> {
fn workspace_root() -> &'static object_store::path::Path {
static WORKSPACE_ROOT_LOCK: OnceLock<object_store::path::Path> = OnceLock::new();
WORKSPACE_ROOT_LOCK.get_or_init(|| {
-// e.g. /Software/arrow-datafusion/datafusion/core
+// e.g. /Software/datafusion/datafusion/core
let dir = PathBuf::from(env!("CARGO_MANIFEST_DIR"));

-// e.g. /Software/arrow-datafusion/datafusion
+// e.g. /Software/datafusion/datafusion
let workspace_root = dir
.parent()
.expect("Can not find parent of datafusion/core")
-// e.g. /Software/arrow-datafusion
+// e.g. /Software/datafusion
.parent()
.expect("parent of datafusion")
.to_string_lossy();

let sanitized_workplace_root = if cfg!(windows) {
-// Object store paths are delimited with `/`, e.g. `D:/a/arrow-datafusion/arrow-datafusion/testing/data/csv/aggregate_test_100.csv`.
-// The default windows delimiter is `\`, so the workplace path is `D:\a\arrow-datafusion\arrow-datafusion`.
+// Object store paths are delimited with `/`, e.g. `/datafusion/datafusion/testing/data/csv/aggregate_test_100.csv`.
+// The default windows delimiter is `\`, so the workplace path is `datafusion\datafusion`.
workspace_root
.replace(std::path::MAIN_SEPARATOR, object_store::path::DELIMITER)
} else {
8 changes: 4 additions & 4 deletions dev/release/README.md
@@ -223,7 +223,7 @@ Here is my vote:
+1

[1]: https://github.com/apache/datafusion/tree/a5dd428f57e62db20a945e8b1895de91405958c4
-[2]: https://dist.apache.org/repos/dist/dev/arrow/apache-arrow-datafusion-5.1.0
+[2]: https://dist.apache.org/repos/dist/dev/arrow/apache-datafusion-5.1.0

Contributor comment: This isn't quite right (this path doesn't exist) but I think it will be fixed when we update the release scripts

[3]: https://github.com/apache/datafusion/blob/a5dd428f57e62db20a945e8b1895de91405958c4/CHANGELOG.md
```

@@ -249,7 +249,7 @@ NOTE: steps in this section can only be done by PMC members.
### After the release is approved

Move artifacts to the release location in SVN, e.g.
-https://dist.apache.org/repos/dist/release/arrow/arrow-datafusion-5.1.0/, using
+https://dist.apache.org/repos/dist/release/datafusion/datafusion-5.1.0/, using
the `release-tarball.sh` script:

```shell
@@ -437,7 +437,7 @@ svn ls https://dist.apache.org/repos/dist/dev/arrow | grep datafusion
Delete a release candidate:

```bash
svn delete -m "delete old DataFusion RC" https://dist.apache.org/repos/dist/dev/arrow/apache-arrow-datafusion-7.1.0-rc1/
svn delete -m "delete old DataFusion RC" https://dist.apache.org/repos/dist/dev/datafusion/apache-datafusion-7.1.0-rc1/
```

#### Deleting old releases from `release` svn
@@ -453,7 +453,7 @@ svn ls https://dist.apache.org/repos/dist/release/arrow | grep datafusion
Delete a release:

```bash
svn delete -m "delete old DataFusion release" https://dist.apache.org/repos/dist/release/arrow/arrow-datafusion-7.0.0
svn delete -m "delete old DataFusion release" https://dist.apache.org/repos/dist/release/datafusion/datafusion-7.0.0
```

### Publish the User Guide to the Arrow Site
2 changes: 1 addition & 1 deletion docs/source/contributor-guide/communication.md
@@ -37,7 +37,7 @@ We use the Slack and Discord platforms for informal discussions and coordination
meet other contributors and get guidance on where to contribute. It is important to note that any technical designs and
decisions are made fully in the open, on GitHub.

-Most of us use the `#arrow-datafusion` and `#arrow-rust` channels in the [ASF Slack workspace](https://s.apache.org/slack-invite) .
+Most of us use the `#datafusion` and `#arrow-rust` channels in the [ASF Slack workspace](https://s.apache.org/slack-invite) .

Member comment: I have just renamed the slack channel to match this documentation change

Unfortunately, due to spammers, the ASF Slack workspace requires an invitation to join. To get an invitation,
request one in the `Arrow Rust` channel of the [Arrow Rust Discord server](https://discord.gg/Qw5gKqHxUM).

2 changes: 1 addition & 1 deletion docs/source/user-guide/example-usage.md
@@ -274,7 +274,7 @@ backtrace: 0: std::backtrace_rs::backtrace::libunwind::trace
3: std::backtrace::Backtrace::capture
at /rustc/5680fa18feaa87f3ff04063800aec256c3d4b4be/library/std/src/backtrace.rs:298:9
4: datafusion_common::error::DataFusionError::get_back_trace
-at /arrow-datafusion/datafusion/common/src/error.rs:436:30
+at /datafusion/datafusion/common/src/error.rs:436:30
5: datafusion_sql::expr::function::<impl datafusion_sql::planner::SqlToRel<S>>::sql_function_to_expr
............
```
2 changes: 1 addition & 1 deletion docs/source/user-guide/faq.md
@@ -28,7 +28,7 @@ DataFusion is a library for executing queries in-process using the Apache Arrow memory
model and computational kernels. It is designed to run within a single process, using threads
for parallel query execution.
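
For illustration, a minimal sketch of this in-process model using DataFusion's `SessionContext`; the CSV path and the column names `a`/`b` are placeholders for the example, not part of the FAQ:

```rust
use datafusion::error::Result;
use datafusion::prelude::*;

#[tokio::main]
async fn main() -> Result<()> {
    // Everything below runs inside the current process; DataFusion
    // parallelizes execution across threads internally.
    let ctx = SessionContext::new();

    // Register a CSV file as a table (path is a placeholder).
    ctx.register_csv("example", "data/example.csv", CsvReadOptions::new())
        .await?;

    // Plan and execute a SQL query, then print the results.
    let df = ctx.sql("SELECT a, MIN(b) FROM example GROUP BY a").await?;
    df.show().await?;
    Ok(())
}
```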

-[Ballista](https://github.com/apache/arrow-ballista) is a distributed compute platform built on DataFusion.
+[Ballista](https://github.com/apache/datafusion-ballista) is a distributed compute platform built on DataFusion.

# How does DataFusion Compare with `XYZ`?

4 changes: 2 additions & 2 deletions docs/source/user-guide/introduction.md
@@ -95,7 +95,7 @@ Here are some active projects using DataFusion:
<!-- "Active" means github repositories that had at least one commit in the last 6 months -->

- [Arroyo](https://github.com/ArroyoSystems/arroyo) Distributed stream processing engine in Rust
-- [Ballista](https://github.com/apache/arrow-ballista) Distributed SQL Query Engine
+- [Ballista](https://github.com/apache/datafusion-ballista) Distributed SQL Query Engine
- [Comet](https://github.com/apache/datafusion-comet) Apache Spark native query execution plugin
- [CnosDB](https://github.com/cnosdb/cnosdb) Open Source Distributed Time Series Database
- [Cube Store](https://github.com/cube-js/cube.js/tree/master/rust)
@@ -129,7 +129,7 @@ Here are some less active projects that used DataFusion:
- [Flock](https://github.com/flock-lab/flock)
- [Tensorbase](https://github.com/tensorbase/tensorbase)

-[ballista]: https://github.com/apache/arrow-ballista
+[ballista]: https://github.com/apache/datafusion-ballista
[blaze]: https://github.com/blaze-init/blaze
[cloudfuse buzz]: https://github.com/cloudfuse-io/buzz-rust
[cnosdb]: https://github.com/cnosdb/cnosdb