
Enhance utoipa-swagger-ui build #936

Merged: 1 commit, May 21, 2024
23 changes: 21 additions & 2 deletions utoipa-swagger-ui/Cargo.toml
@@ -12,15 +12,20 @@ authors = ["Juha Kukkonen <[email protected]>"]
rust-version.workspace = true

[features]
default = ["url"]
debug = []
debug-embed = ["rust-embed/debug-embed"]
reqwest = ["dep:reqwest"]
url = ["dep:url"]
Comment on lines +18 to +19

I don't see anything feature-flagged on url, so it looks like it's just always required?

Contributor

I think there are some places like here.

But in principle at least, `url` is necessary to parse SWAGGER_UI_DOWNLOAD_URL in all cases except the vendored swagger_ui.zip case, so maybe the vendored flag is enough?

Owner Author

True, it would be enough at this point, and it would work without an explicit `url` feature since optional dependencies become implicit feature flags anyway. But it does not matter.
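A minimal sketch of that last point (a hypothetical excerpt, not this crate's actual manifest): an optional dependency gets an implicit feature of the same name automatically, unless a `dep:` reference in `[features]` hides it.

```toml
# Hypothetical manifest excerpt illustrating implicit features: declaring an
# optional dependency without any `dep:url` entry in [features] makes Cargo
# expose an implicit `url` feature automatically.
[dependencies]
url = { version = "2.5", optional = true }

[features]
# No explicit `url = ["dep:url"]` entry needed; `--features url` still works.
default = []
```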


[dependencies]
rust-embed = { version = "8" }
mime_guess = { version = "2.0" }
actix-web = { version = "4", optional = true, default-features = false }
rocket = { version = "0.5", features = ["json"], optional = true }
axum = { version = "0.7", default-features = false, features = ["json"], optional = true }
axum = { version = "0.7", default-features = false, features = [
"json",
], optional = true }
utoipa = { version = "4", path = "../utoipa" }
serde = { version = "1.0", features = ["derive"] }
serde_json = { version = "1.0" }
@@ -35,4 +40,18 @@ rustdoc-args = ["--cfg", "doc_cfg"]
[build-dependencies]
zip = { version = "1", default-features = false, features = ["deflate"] }
regex = "1.7"
reqwest = { version = "0.12", features = ["blocking", "rustls-tls"], default-features = false }

# Enabled optionally to allow a Rust-only, platform-independent build at the expense of a
# bigger dependency tree. By default the `curl` system package is tried for downloading the Swagger UI.
reqwest = { version = "0.12", features = [
"blocking",
"rustls-tls",
], default-features = false, optional = true }
url = { version = "2.5", optional = true }

# Windows is forced to use reqwest to download the Swagger UI.
[target.'cfg(windows)'.build-dependencies]
reqwest = { version = "0.12", features = [
"blocking",
"rustls-tls",
], default-features = false }
11 changes: 10 additions & 1 deletion utoipa-swagger-ui/README.md
@@ -30,6 +30,9 @@ more details at [serve](https://docs.rs/utoipa-swagger-ui/latest/utoipa_swagger_
hassle free.
* **debug-embed** Enables `debug-embed` feature on `rust_embed` crate to allow embedding files in debug
builds as well.
* **reqwest** Use `reqwest` for downloading the Swagger UI according to the `SWAGGER_UI_DOWNLOAD_URL` environment
variable. This is only enabled by default on _Windows_.
* **url** Enabled by default for parsing and encoding the download URL.

## Install

@@ -49,7 +52,13 @@ utoipa-swagger-ui = { version = "7", features = ["actix-web"] }

**Note!** Also remember that you have already defined the `utoipa` dependency in your `Cargo.toml`

## Config
## Build Config

_The `utoipa-swagger-ui` crate will by default try to use the system `curl` package for downloading the Swagger UI. It
can optionally be downloaded with `reqwest` by enabling the `reqwest` feature. On Windows the `reqwest` feature
is enabled by default. `reqwest` can be useful for platform-independent builds, but it brings quite a few
extra dependencies just to download a file. If `SWAGGER_UI_DOWNLOAD_URL` is a file path then no
downloading will happen._
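For example (path hypothetical), a vendored archive can be supplied so the build performs no download at all:

```shell
# Point the build at a locally vendored Swagger UI zip; neither curl nor reqwest runs.
SWAGGER_UI_DOWNLOAD_URL=file:///path/to/swagger-ui-5.17.3.zip cargo build
```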

The following configuration env variables are available at build time:

111 changes: 85 additions & 26 deletions utoipa-swagger-ui/build.rs
@@ -1,10 +1,10 @@
use reqwest::Url;
use std::{
env,
error::Error,
fs::{self, File},
io::{self, Read},
path::PathBuf,
io,
path::{Path, PathBuf},
process::Command,
};

use regex::Regex;
@@ -22,34 +22,40 @@ use zip::{result::ZipError, ZipArchive};
const SWAGGER_UI_DOWNLOAD_URL_DEFAULT: &str =
"https://github.com/swagger-api/swagger-ui/archive/refs/tags/v5.17.3.zip";

const SWAGGER_UI_DOWNLOAD_URL: &str = "SWAGGER_UI_DOWNLOAD_URL";
const SWAGGER_UI_OVERWRITE_FOLDER: &str = "SWAGGER_UI_OVERWRITE_FOLDER";

fn main() {
let target_dir = env::var("OUT_DIR").unwrap();
println!("OUT_DIR: {}", target_dir);
println!("OUT_DIR: {target_dir}");

let url =
env::var("SWAGGER_UI_DOWNLOAD_URL").unwrap_or(SWAGGER_UI_DOWNLOAD_URL_DEFAULT.to_string());
env::var(SWAGGER_UI_DOWNLOAD_URL).unwrap_or(SWAGGER_UI_DOWNLOAD_URL_DEFAULT.to_string());


url is parsed into url::Url on every branch taken (explicitly for file://, explicitly for curl, and implicitly via IntoUrl for reqwest).

Doing it up here instead would also give access to Url::scheme instead of approximating it with str::starts_with.

Contributor

Indeed.

Owner Author

True that. Maybe something that can be considered in future if it starts to bother someone 😆 Not going to edit it for now though
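A std-only sketch of the reviewer's suggestion (the real build script would call `url::Url::parse(&url)` and use `Url::scheme()`; the helper name here is hypothetical): extracting the scheme once lets the dispatch match on `"http" | "https" | "file"` instead of repeated `starts_with` checks.

```rust
// Extract the URL scheme per RFC 3986: ALPHA *( ALPHA / DIGIT / "+" / "-" / "." )
// followed by ':'. Returns None for plain paths with no scheme.
fn scheme(url: &str) -> Option<&str> {
    let (candidate, _) = url.split_once(':')?;
    let mut chars = candidate.chars();
    let first = chars.next()?;
    if first.is_ascii_alphabetic()
        && chars.all(|ch| ch.is_ascii_alphanumeric() || "+-.".contains(ch))
    {
        Some(candidate)
    } else {
        None
    }
}

fn main() {
    let url = "file:///tmp/swagger-ui.zip";
    // Dispatch on the parsed scheme rather than repeated starts_with checks.
    match scheme(url) {
        Some("http" | "https") => println!("download: {url}"),
        Some("file") => println!("copy from local path: {url}"),
        _ => panic!("invalid SWAGGER_UI_DOWNLOAD_URL: {url}"),
    }
}
```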

println!("SWAGGER_UI_DOWNLOAD_URL: {}", url);
println!("{SWAGGER_UI_DOWNLOAD_URL}: {url}");
let zip_filename = url.split('/').last().unwrap().to_string();
let zip_path = [&target_dir, &zip_filename].iter().collect::<PathBuf>();

if !zip_path.exists() {
if url.starts_with("http://") || url.starts_with("https://") {

I assume the cache is still desired for HTTP(S) downloads.

Contributor

Yes: I understand that this is covered by println!("cargo:rerun-if-env-changed={SWAGGER_UI_DOWNLOAD_URL}"); here, right?


Sort of, rerun-if-env-changed tells cargo about more reasons to rebuild, while the old cache was trying to avoid redownloading when it was rebuilt. (On the other hand, I'm not sure how useful that was anyway given that I'm not sure how Cargo decides on when to recreate out folders.)


To be clear: I'm fine with the changed behaviour, just wanted to double check that it was intended.

Owner Author (@juhaku, May 26, 2024)

Yeah, I removed it because cargo appears to be inconsistent about creating the folders, so even if there was a cache from previous builds, cargo might still create a new set of folders when building, invalidating the cache in the meantime anyway. Perhaps it is just better to let it download whenever cargo should rebuild it. After all, that is not going to be rebuilt unless one runs cargo clean.
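The behaviour the thread settles on can be sketched as a small helper (name hypothetical) that picks the rebuild trigger the build script emits: `rerun-if-changed` for a vendored `file:` URL, `rerun-if-env-changed` otherwise.

```rust
// Sketch of the rebuild triggers discussed above. Cargo re-runs the build
// script when any emitted `cargo:rerun-if-*` condition changes.
fn rerun_directives(url: &str) -> Vec<String> {
    if url.starts_with("file:") {
        // Vendored zip: rebuild when the file itself changes.
        vec![format!(
            "cargo:rerun-if-changed={}",
            url.trim_start_matches("file://")
        )]
    } else {
        // HTTP(S): rebuild when the download URL env var changes.
        vec!["cargo:rerun-if-env-changed=SWAGGER_UI_DOWNLOAD_URL".to_string()]
    }
}

fn main() {
    for directive in rerun_directives("https://example.com/ui.zip") {
        println!("{directive}");
    }
}
```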

println!("start download to : {:?}", zip_path);
download_file(&url, zip_path.clone()).unwrap();
} else if url.starts_with("file://") {
let file_path = Url::parse(&url).unwrap().to_file_path().unwrap();
println!("start copy to : {:?}", zip_path);
fs::copy(file_path, zip_path.clone()).unwrap();
} else {
panic!("invalid SWAGGER_UI_DOWNLOAD_URL: {} -> must start with http:// | https:// | file://", url);
}
if url.starts_with("file:") {
let mut file_path = url::Url::parse(&url).unwrap().to_file_path().unwrap();
file_path = fs::canonicalize(file_path).expect("swagger ui download path should exists");

// with file protocol utoipa swagger ui should compile when file changes
println!("cargo:rerun-if-changed={:?}", file_path);

println!("start copy to : {:?}", zip_path);
fs::copy(file_path, zip_path.clone()).unwrap();
} else if url.starts_with("http://") || url.starts_with("https://") {
Comment on lines +43 to +47

At this point this really just seems equivalent to zip_path = file_path;.

Owner Author

Yeah, this is a little bit older commit, but in practice yes, the copy seems to be superfluous since it is extracted to the output dir anyway. But perhaps we can live with that 😆

println!("start download to : {:?}", zip_path);

// with http protocol we update when the 'SWAGGER_UI_DOWNLOAD_URL' changes
println!("cargo:rerun-if-env-changed={SWAGGER_UI_DOWNLOAD_URL}");

download_file(&url, zip_path.clone()).unwrap();
} else {
println!("already downloaded or copied: {:?}", zip_path);
panic!("invalid {SWAGGER_UI_DOWNLOAD_URL}: {url} -> must start with http:// | https:// | file:");
}

println!("cargo:rerun-if-changed={:?}", zip_path.clone());

let swagger_ui_zip =
File::open([&target_dir, &zip_filename].iter().collect::<PathBuf>()).unwrap();

@@ -63,10 +69,10 @@ fn main() {
write_embed_code(&target_dir, &zip_top_level_folder);

let overwrite_folder =
PathBuf::from(env::var("SWAGGER_UI_OVERWRITE_FOLDER").unwrap_or("overwrite".to_string()));
PathBuf::from(env::var(SWAGGER_UI_OVERWRITE_FOLDER).unwrap_or("overwrite".to_string()));

if overwrite_folder.exists() {
println!("SWAGGER_UI_OVERWRITE_FOLDER: {:?}", overwrite_folder);
println!("{SWAGGER_UI_OVERWRITE_FOLDER}: {overwrite_folder:?}");

for entry in fs::read_dir(overwrite_folder).unwrap() {
let entry = entry.unwrap();
@@ -75,10 +81,7 @@ fn main() {
overwrite_target_file(&target_dir, &zip_top_level_folder, path_in);
}
} else {
println!(
"SWAGGER_UI_OVERWRITE_FOLDER not found: {:?}",
overwrite_folder
);
println!("{SWAGGER_UI_OVERWRITE_FOLDER} not found: {overwrite_folder:?}");
}
}

@@ -170,6 +173,22 @@ struct SwaggerUiDist;
}

fn download_file(url: &str, path: PathBuf) -> Result<(), Box<dyn Error>> {
let reqwest_feature = env::var("CARGO_FEATURE_REQWEST");
println!("reqwest feature: {reqwest_feature:?}");
if reqwest_feature.is_ok() {
#[cfg(any(feature = "reqwest", target_os = "windows"))]
{
download_file_reqwest(url, path)?;
}
Ok(())
} else {
println!("trying to download using `curl` system package");
download_file_curl(url, path.as_path())
}
}

#[cfg(any(feature = "reqwest", target_os = "windows"))]
fn download_file_reqwest(url: &str, path: PathBuf) -> Result<(), Box<dyn Error>> {
let mut client_builder = reqwest::blocking::Client::builder();

if let Ok(cainfo) = env::var("CARGO_HTTP_CAINFO") {
@@ -189,13 +208,53 @@ fn download_file(url: &str, path: PathBuf) -> Result<(), Box<dyn Error>> {
Ok(())
}

#[cfg(any(feature = "reqwest", target_os = "windows"))]
fn parse_ca_file(path: &str) -> Result<reqwest::Certificate, Box<dyn Error>> {
let mut buf = Vec::new();
use io::Read;
File::open(path)?.read_to_end(&mut buf)?;
let cert = reqwest::Certificate::from_pem(&buf)?;
Ok(cert)
}

fn download_file_curl<T: AsRef<Path>>(url: &str, target_dir: T) -> Result<(), Box<dyn Error>> {
let url = url::Url::parse(url)?;

let mut args = Vec::with_capacity(6);
args.extend([
"-sSL",
"-o",
target_dir
.as_ref()
.as_os_str()
.to_str()
.expect("target dir should be valid utf-8"),
url.as_str(),
]);
let cacert = env::var("CARGO_HTTP_CAINFO").unwrap_or_default();
if !cacert.is_empty() {
args.extend(["--cacert", &cacert]);
}

let download = Command::new("curl")
.args(args)
.spawn()
.and_then(|mut child| child.wait());

Ok(download
.and_then(|status| {
if status.success() {
Ok(())
} else {
Err(std::io::Error::new(
io::ErrorKind::Other,
format!("curl download file exited with error status: {status}"),
))
}
})
.map_err(Box::new)?)
}

fn overwrite_target_file(target_dir: &str, swagger_ui_dist_zip: &str, path_in: PathBuf) {
let filename = path_in.file_name().unwrap().to_str().unwrap();
println!("overwrite file: {:?}", path_in.file_name().unwrap());