Oracle 1z0-449 Exam Actual Questions

The questions for 1z0-449 were last updated on Nov. 21, 2024.

Topic 1 - Single Topic

Question #1 Topic 1

You need to place the results of a Pig Latin script into an HDFS output directory.
What is the correct syntax in Apache Pig?

  • A. update hdfs set D as ‘./output’;
  • B. store D into ‘./output’;
  • C. place D into ‘./output’;
  • D. write D as ‘./output’;
  • E. hdfsstore D into ‘./output’;

Correct Answer: B
Use the STORE operator to run (execute) Pig Latin statements and save (persist) results to the file system. Use STORE for production scripts and batch mode processing.
Syntax: STORE alias INTO 'directory' [USING function];
Example: In this example, data is stored using PigStorage with the asterisk character (*) as the field delimiter.
A = LOAD 'data' AS (a1:int,a2:int,a3:int);
DUMP A;
(1,2,3)
(4,2,1)
(8,3,4)
(4,3,3)
(7,2,5)
(8,4,3)
STORE A INTO 'myoutput' USING PigStorage ('*');
CAT myoutput;
1*2*3
4*2*1
8*3*4
4*3*3
7*2*5
8*4*3
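Mapped back onto the question, a minimal sketch (the LOAD line and its schema are assumptions added only to make the script self-contained; the question itself supplies the alias D and the path './output'):
D = LOAD 'input' AS (f1:int, f2:int, f3:int);  -- hypothetical input relation
STORE D INTO './output';                       -- answer B: persists D to the HDFS directory ./output
The other options (update, place, write, hdfsstore) are not Pig Latin operators.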
References:
https://pig.apache.org/docs/r0.13.0/basic.html#store

Question #2 Topic 1

How is Oracle Loader for Hadoop (OLH) better than Apache Sqoop?

  • A. OLH performs a great deal of preprocessing of the data on Hadoop before loading it into the database.
  • B. OLH performs a great deal of preprocessing of the data on the Oracle database before loading it into NoSQL.
  • C. OLH does not use MapReduce to process any of the data, thereby increasing performance.
  • D. OLH performs a great deal of preprocessing of the data on the Oracle database before loading it into Hadoop.
  • E. OLH is fully supported on the Big Data Appliance. Apache Sqoop is not supported on the Big Data Appliance.

Correct Answer: A
Oracle Loader for Hadoop provides an efficient and high-performance loader for fast movement of data from a Hadoop cluster into a table in an Oracle database.
Oracle Loader for Hadoop prepartitions the data if necessary and transforms it into a database-ready format. It optionally sorts records by primary key or user-defined columns before loading the data or creating output files.
Note: Apache Sqoop(TM) is a tool designed for efficiently transferring bulk data between Apache Hadoop and structured datastores such as relational databases.
Incorrect Answers:
B, D: Oracle Loader for Hadoop provides an efficient and high-performance loader for fast movement of data from a Hadoop cluster into a table in an Oracle database; the preprocessing happens on Hadoop, not on the Oracle database, and the target is an Oracle database table, not NoSQL or Hadoop.
C: Oracle Loader for Hadoop is a MapReduce application that is invoked as a command-line utility. It accepts the generic command-line options that are supported by the org.apache.hadoop.util.Tool interface.
E: The Oracle Linux operating system and Cloudera's Distribution including Apache Hadoop (CDH) underlie all other software components installed on Oracle Big Data Appliance. CDH includes Apache projects for MapReduce and HDFS, such as Hive, Pig, Oozie, ZooKeeper, HBase, Sqoop, and Spark, so Sqoop is in fact supported on the Big Data Appliance.
References:
https://docs.oracle.com/cd/E37231_01/doc.20/e36961/start.htm#BDCUG326
https://docs.oracle.com/cd/E55905_01/doc.40/e55814/concepts.htm#BIGUG117

Question #3 Topic 1

Which three pieces of hardware are present on each node of the Big Data Appliance? (Choose three.)

  • A. high capacity SAS disks
  • B. memory
  • C. redundant Power Delivery Units
  • D. InfiniBand ports
  • E. InfiniBand leaf switches

Correct Answer: ABD
Big Data Appliance Hardware Specification and Details, example:
Per Node:
  • 2 x Eight-Core Intel Xeon E5-2660 Processors (2.2 GHz)
  • 64 GB Memory (expandable to 256 GB)
  • Disk Controller HBA with 512 MB battery-backed write cache
  • 12 x 3 TB 7,200 RPM High Capacity SAS Disks
  • 2 x QDR (Quad Data Rate) InfiniBand (40 Gb/s) Ports
  • 4 x 10 Gb Ethernet Ports
  • 1 x ILOM Ethernet Port
References:
http://www.oracle.com/technetwork/server-storage/engineered-systems/bigdata-appliance/overview/bigdataappliancev2-datasheet-1871638.pdf

Question #4 Topic 1

What two actions do the following commands perform in the Oracle R Advanced Analytics for Hadoop Connector? (Choose two.)

ore.connect(type="HIVE")
ore.attach()

  • A. Connect to Hive.
  • B. Attach the Hadoop libraries to R.
  • C. Attach the current environment to the search path of R.
  • D. Connect to NoSQL via Hive.

Correct Answer: AC
You can connect to Hive and manage objects using R functions that have an ore prefix, such as ore.connect.
To attach the current environment to the search path of R, use:
ore.attach()
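
As a minimal sketch, the same two calls annotated with the actions they perform (the trailing ore.ls() call is an assumption added only to illustrate the effect of attaching; it is not part of the question):
# Answer A: connect the ORE transparency layer to Hive.
ore.connect(type="HIVE")
# Answer C: attach the current environment to the search path of R,
# so Hive tables can be referenced by name like ordinary R objects.
ore.attach()
# Assumed follow-up for illustration: list the objects now visible on the search path.
ore.ls()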
References:
https://docs.oracle.com/cd/E49465_01/doc.23/e49333/orch.htm#BDCUG400
