Ambari

Installing Hadoop Using Apache Ambari

2013-01-14

Contents

1. Getting Ready to Install
1. Understand the Basics
2. Meet Minimum System Requirements
2.1. Hardware Recommendations
2.2. Operating System Requirements
2.3. Browser Requirements
2.4. Software Requirements
2.5. Database Requirements
3. Decide on Deployment Type
4. Collect Information
5. Prepare the Environment
5.1. Check Existing Installs
5.2. Set Up Password-less SSH
5.3. Enable NTP on the Cluster
5.4. Check DNS
5.5. Disable SELinux
5.6. Disable iptables
6. Optional: Configure the Local Repositories
2. Running the Installer
1. Set Up the Bits
1.1. RHEL/CentOS 5.x
1.2. RHEL/CentOS 6.x
1.3. SLES 11
2. Set Up the Server
2.1. Setup Options
3. Start the Ambari Server
3. Installing, Configuring, and Deploying the Cluster
1. Log into Apache Ambari
2. Welcome
3. Install Options
4. Confirm Hosts
5. Choose Services
6. Assign Masters
7. Assign Slaves and Clients
8. Customize Services
8.1. HDFS
8.2. MapReduce
8.3. Hive/HCat
8.4. WebHCat
8.5. HBase
8.6. ZooKeeper
8.7. Oozie
8.8. Nagios
8.9. Misc
8.10. Recommended Memory Configurations for the MapReduce Service
9. Review
10. Install, Start and Test
11. Summary
4. Troubleshooting Ambari Deployments
1. Getting the Logs
2. Quick Checks
3. Specific Issues
3.1. Problem: Browser crashed before Install Wizard completed
3.2. Problem: The Install Wizard reports that the cluster install has failed
3.3. Problem: “Unable to create new native thread” exceptions in HDFS DataNode logs or those of any system daemon
3.4. Problem: The “yum install ambari-server” Command Fails
3.5. Problem: HDFS Smoke Test Fails
3.6. Problem: The HCatalog Daemon Metastore Smoke Test Fails
3.7. Problem: MySQL and Nagios fail to install on RightScale CentOS 5 images on EC2
3.8. Problem: Trouble starting Ambari on system reboot
5. Appendix: Installing Ambari Agents Manually
1. RHEL/CentOS 5.x and 6.x
2. SLES
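
For quick orientation, the server-side portion of the install outlined in "Running the Installer" comes down to a few commands. The sketch below is illustrative only and assumes a RHEL/CentOS host with the Ambari repository already configured as described in "Set Up the Bits"; the "yum install ambari-server" command is the one referenced in the troubleshooting section, and the setup and start steps correspond to "Set Up the Server" and "Start the Ambari Server".

    # Install the Ambari Server package from the configured repository
    # (see "Set Up the Bits" for the per-OS repository configuration).
    yum install ambari-server

    # Run the interactive setup; this covers the database, JDK, and
    # run-as account choices described under "Setup Options".
    ambari-server setup

    # Start the server, then open the Ambari web UI to run the
    # Cluster Install Wizard ("Installing, Configuring, and Deploying the Cluster").
    ambari-server start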
