HDP Data Services

Hortonworks Data Platform

2015-12-21


Contents

1. Using Apache Hive
   1. Hive Documentation
   2. Features Overview
      2.1. Temporary Tables
      2.2. Cost-Based SQL Optimization
      2.3. Optimized Row Columnar (ORC) Format
      2.4. Streaming Data Ingestion
      2.5. Query Vectorization
      2.6. Comparing Beeline to the Hive CLI
   3. Moving Data into Hive
      3.1. Moving Data from HDFS to Hive Using an External Table
      3.2. Using Sqoop to Move Data into Hive
      3.3. Incrementally Updating a Hive Table Using Sqoop and an External Table
   4. Hive JDBC and ODBC Drivers
   5. Configuring HiveServer2 for Transactions (ACID Support)
   6. Configuring HiveServer2 for LDAP and for LDAP over SSL
   7. Troubleshooting Hive
   8. Hive JIRAs
2. SQL Compliance
   1. INSERT ... VALUES, UPDATE, and DELETE SQL Statements
   2. SQL Standard-Based Authorization with GRANT and REVOKE SQL Statements
   3. Transactions
   4. Subqueries
   5. Common Table Expressions
   6. Quoted Identifiers in Column Names
   7. CHAR Data Type Support
3. Running Pig with the Tez Execution Engine
4. Using HDP for Metadata Services (HCatalog)
   1. Using HCatalog
   2. Using WebHCat
   3. Security for WebHCat
5. Using Apache HBase
   1. Cell-level Access Control Lists (ACLs)
   2. Column Family Encryption
   3. Tuning Region Server
6. Using HDP for Workflow and Scheduling (Oozie)
7. Using Apache Sqoop
   1. Apache Sqoop Connectors
   2. Sqoop Import Table Commands
   3. Netezza Connector
   4. Sqoop-HCatalog Integration
   5. Controlling Transaction Isolation
   6. Automatic Table Creation
   7. Delimited Text Formats and Field and Line Delimiter Characters
   8. HCatalog Table Requirements
   9. Support for Partitioning
   10. Schema Mapping
   11. Support for HCatalog Data Types
   12. Providing Hive and HCatalog Libraries for the Sqoop Job
   13. Examples