1. Translated more of the initial docs, removed the RDD Programming Guide link, and fixed the API Docs links. 2. Translated the header and sidebar. 3. Miscellaneous small fixes and consistency cleanups.
1 parent 26d7cc7 · commit 421770f
Showing 12 changed files with 71 additions and 80 deletions.
@@ -1,74 +1,74 @@
-- text: Getting Started
+- text: 시작하기
  url: sql-getting-started.html
  subitems:
-- text: "Starting Point: SparkSession"
+- text: "시작점: SparkSession"
  url: sql-getting-started.html#starting-point-sparksession
-- text: Creating DataFrames
+- text: DataFrame 생성하기
  url: sql-getting-started.html#creating-dataframes
-- text: Untyped Dataset Operations (DataFrame operations)
+- text: 타입이 없는 Dataset 동작 (DataFrame 동작)
  url: sql-getting-started.html#untyped-dataset-operations-aka-dataframe-operations
-- text: Running SQL Queries Programmatically
+- text: 응용 프로그램 안에서 SQL 쿼리 실행하기
  url: sql-getting-started.html#running-sql-queries-programmatically
-- text: Global Temporary View
+- text: 전역 임시 뷰
  url: sql-getting-started.html#global-temporary-view
-- text: Creating Datasets
+- text: Dataset 생성하기
  url: sql-getting-started.html#creating-datasets
-- text: Interoperating with RDDs
+- text: RDD 연동하기
  url: sql-getting-started.html#interoperating-with-rdds
-- text: Aggregations
+- text: 집계(Aggregations)
  url: sql-getting-started.html#aggregations
-- text: Data Sources
+- text: 데이터 소스
  url: sql-data-sources.html
  subitems:
-- text: "Generic Load/Save Functions"
+- text: "일반 불러오기/저장하기 함수"
  url: sql-data-sources-load-save-functions.html
-- text: Parquet Files
+- text: Parquet 파일
  url: sql-data-sources-parquet.html
-- text: ORC Files
+- text: ORC 파일
  url: sql-data-sources-orc.html
-- text: JSON Files
+- text: JSON 파일
  url: sql-data-sources-json.html
-- text: Hive Tables
+- text: Hive 테이블
  url: sql-data-sources-hive-tables.html
-- text: JDBC To Other Databases
+- text: JDBC를 통한 다른 데이터베이스 사용하기
  url: sql-data-sources-jdbc.html
-- text: Avro Files
+- text: Avro 파일
  url: sql-data-sources-avro.html
-- text: Troubleshooting
+- text: 문제 해결
  url: sql-data-sources-troubleshooting.html
-- text: Performance Tuning
+- text: 성능 튜닝
  url: sql-performance-tuning.html
  subitems:
-- text: Caching Data In Memory
+- text: 메모리에 데이터 캐싱하기
  url: sql-performance-tuning.html#caching-data-in-memory
-- text: Other Configuration Options
+- text: 기타 설정 옵션
  url: sql-performance-tuning.html#other-configuration-options
-- text: Broadcast Hint for SQL Queries
+- text: SQL 쿼리를 위한 브로드캐스트 힌트
  url: sql-performance-tuning.html#broadcast-hint-for-sql-queries
-- text: Distributed SQL Engine
+- text: 분산 SQL 엔진
  url: sql-distributed-sql-engine.html
  subitems:
-- text: "Running the Thrift JDBC/ODBC server"
+- text: "Thrift JDBC/ODBC 서버 실행하기"
  url: sql-distributed-sql-engine.html#running-the-thrift-jdbcodbc-server
-- text: Running the Spark SQL CLI
+- text: 스파크 SQL CLI 실행하기
  url: sql-distributed-sql-engine.html#running-the-spark-sql-cli
-- text: PySpark Usage Guide for Pandas with Apache Arrow
+- text: 아파치 애로우(Arrow)와 Pandas를 위한 PySpark 사용 가이드
  url: sql-pyspark-pandas-with-arrow.html
  subitems:
-- text: Apache Arrow in Spark
+- text: 스파크에서의 아파치 애로우
  url: sql-pyspark-pandas-with-arrow.html#apache-arrow-in-spark
-- text: "Enabling for Conversion to/from Pandas"
+- text: "Pandas와의 변환 활성화하기"
  url: sql-pyspark-pandas-with-arrow.html#enabling-for-conversion-tofrom-pandas
-- text: "Pandas UDFs (a.k.a. Vectorized UDFs)"
+- text: "Pandas UDF (일명 ‘벡터화된 UDF’)"
  url: sql-pyspark-pandas-with-arrow.html#pandas-udfs-aka-vectorized-udfs
-- text: Usage Notes
+- text: 유의 사항
  url: sql-pyspark-pandas-with-arrow.html#usage-notes
-- text: Reference
+- text: 참조
  url: sql-reference.html
  subitems:
-- text: Data Types
+- text: 데이터 타입
  url: sql-reference.html#data-types
-- text: NaN Semantics
+- text: NaN 의미 구조
  url: sql-reference.html#nan-semantics
-- text: Arithmetic operations
+- text: 산술 연산
  url: sql-reference.html#arithmetic-operations
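The scraped diff above flattens the sidebar data file, losing its YAML nesting. In a Jekyll `_data` file of this kind, `subitems` entries are nested under their parent entry; a minimal sketch of the intended shape (indentation is an assumption, since the scrape dropped it) looks like:

```yaml
# Hypothetical reconstruction of one sidebar entry after this commit;
# subitems nest under their parent item rather than sitting at top level.
- text: 시작하기
  url: sql-getting-started.html
  subitems:
    - text: "시작점: SparkSession"
      url: sql-getting-started.html#starting-point-sparksession
    - text: DataFrame 생성하기
      url: sql-getting-started.html#creating-dataframes
```

Only the `text` values change in this commit; every `url` and the nesting structure stay as they were.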
@@ -1,36 +1,29 @@
 ---
 layout: global
-displayTitle: Spark Overview
+displayTitle: 개요
 title: Overview
 description: Apache Spark SPARK_VERSION_SHORT documentation homepage
 ---

-**Programming Guides:**
+**프로그래밍 가이드:**

-* [빠른 시작](quick-start.html): a quick introduction to the Spark API; start here!
-* [RDD Programming Guide](rdd-programming-guide.html): overview of Spark basics - RDDs (core but old API), accumulators, and broadcast variables
-* [스파크 SQL, DataFrame, Dataset 가이드](sql-programming-guide.html): processing structured data with relational queries (newer API than RDDs)
-* [구조화된 스트리밍](structured-streaming-programming-guide.html): processing structured data streams with relational queries (using Datasets and DataFrames, newer API than DStreams)
+* [빠른 시작](quick-start.html): 스파크 API에 대한 초간단 설명입니다. 입문자는 여기부터 읽으세요.
+* [스파크 SQL, DataFrame, Dataset](sql-programming-guide.html): 관계형 쿼리(relational query)를 사용해서 구조화된 데이터(structured data) 처리하기.
+* [구조적 스트리밍](structured-streaming-programming-guide.html): 관계형 쿼리(relational query)를 사용해서 구조화된 데이터 스트림(structured data stream) 처리하기.

-**API Docs:**
+**API 문서:**

-* [Spark Scala API (Scaladoc)](https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.package)
-* [Spark Java API (Javadoc)](https://spark.apache.org/docs/latest/api/java/index.html)
-* [Spark Python API (Sphinx)](https://spark.apache.org/docs/latest/api/python/index.html)
-* [Spark R API (Roxygen2)](https://spark.apache.org/docs/latest/api/R/index.html)
-* [Spark SQL, Built-in Functions (MkDocs)](https://spark.apache.org/docs/latest/api/sql/index.html)
+* [Scala API 문서 (Scaladoc)](https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.package)
+* [Java API 문서 (Javadoc)](https://spark.apache.org/docs/latest/api/java/index.html)
+* [Python API 문서 (Sphinx)](https://spark.apache.org/docs/latest/api/python/index.html)
+* [R API 문서 (Roxygen2)](https://spark.apache.org/docs/latest/api/R/index.html)
+* [스파크 SQL 함수 문서 (MkDocs)](https://spark.apache.org/docs/latest/api/sql/index.html)

-**External Resources:**
+**기타 자료:**

-* [Spark Homepage](https://spark.apache.org)
-* [Spark Community](https://spark.apache.org/community.html) resources, including local meetups
-* [StackOverflow tag `apache-spark`](http://stackoverflow.com/questions/tagged/apache-spark)
-* [Mailing Lists](https://spark.apache.org/mailing-lists.html): ask questions about Spark here
-* [AMP Camps](http://ampcamp.berkeley.edu/): a series of training camps at UC Berkeley that featured talks and
-exercises about Spark, Spark Streaming, Mesos, and more. [Videos](http://ampcamp.berkeley.edu/6/),
-[slides](http://ampcamp.berkeley.edu/6/) and [exercises](http://ampcamp.berkeley.edu/6/exercises/) are
-available online for free.
-* [Code Examples](https://spark.apache.org/examples.html): more are also available in the `examples` subfolder of Spark ([Scala]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/scala/org/apache/spark/examples),
+* [스파크 공식 웹사이트 (영어)](https://spark.apache.org)
+* [예제 코드 (영어)](https://spark.apache.org/examples.html): 스파크 프로젝트의 `examples` 디렉토리에서 더 많은 예제를 볼 수 있습니다. ([Scala]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/scala/org/apache/spark/examples),
 [Java]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/java/org/apache/spark/examples),
 [Python]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/python),
 [R]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/r))
+* [한국 스파크 사용자 모임 (Facebook)](https://www.facebook.com/groups/sparkkoreauser/)
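For context on why only the `text` fields need translating: Jekyll exposes a data file like the sidebar changed in this commit to templates as `site.data.<filename>`, and a layout renders it with a Liquid loop. A hedged sketch (assuming the file lives at `_data/menu-sql.yaml`; the repo's actual include and markup may differ):

```html
<!-- Hypothetical sidebar include: iterates the translated data file.
     item.text / item.url / item.subitems mirror the keys in the diff above. -->
<ul>
  {% for item in site.data.menu-sql %}
    <li>
      <a href="{{ item.url }}">{{ item.text }}</a>
      {% if item.subitems %}
        <ul>
          {% for sub in item.subitems %}
            <li><a href="{{ sub.url }}">{{ sub.text }}</a></li>
          {% endfor %}
        </ul>
      {% endif %}
    </li>
  {% endfor %}
</ul>
```

Because the template reads `text` for display and `url` for anchors, translating the `text` values alone relabels the sidebar without breaking any links.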