Interview MCQ Set 1

1. Which of the following standards does Azure use?
a) REST
b) XML
c) HTTP
d) All of the mentioned

View Answer

Answer: d [Reason:] The Azure Windows Services Platform API uses the industry standard REST, HTTP, and XML protocols that are part of any Service Oriented Architecture cloud infrastructure to allow applications to talk to Azure.
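
Because the platform API is exposed over plain HTTP, any language with an HTTP client can call it. The following is a minimal, hypothetical sketch in Java; the endpoint URI and header are placeholders rather than a real Azure API path, and real calls also require authentication:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AzureRestSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical REST endpoint; real Azure services define their own
        // URIs, API versions, and authentication headers.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example-account.servicebus.windows.net/resource"))
                .header("Accept", "application/xml") // classic API responses are XML
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```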

2. Point out the wrong statement:
a) An Amazon Machine Image can be provisioned with an operating system, an enterprise application, or application stack
b) AWS is a deployment enabler
c) Google Apps lets you create a scalable cloud-based application
d) None of the mentioned

View Answer

Answer: d [Reason:] Google application can only work within the Google infrastructure, and the application is not easily ported to other environments.

3. What does IPsec in the Azure platform refer to?
a) Internet Protocol Security protocol suite
b) Internet Standard
c) Commodity servers
d) All of the mentioned

View Answer

Answer: a [Reason:] IPsec refers to the Internet Protocol Security protocol suite for creating a secure Internet connection between two endpoints.

4. Which of the following web applications can be deployed with Azure?
a) ASP.NET
b) PHP
c) WCF
d) All of the mentioned

View Answer

Answer: d [Reason:] Microsoft also has released SDKs for both Java and Ruby to allow applications written in those languages to place calls to the Azure Service Platform API to the AppFabric Service.

5. Point out the correct statement:
a) The Windows Azure Platform allows a developer to modify his application so it can run in the cloud on virtual machines hosted in Microsoft datacenters
b) Windows Azure serves as a cloud operating system
c) With Azure’s architecture, an application can run locally, run in the cloud, or some combination of both
d) All of the mentioned

View Answer

Answer: d [Reason:] Applications on Azure can be run as applications, as background processes or services, or as both.

6. A _________ role is a virtual machine instance running Microsoft IIS Web server that can accept and respond to HTTP or HTTPS requests.
a) Web
b) Server
c) Worker
d) Client

View Answer

Answer: a [Reason:] A Web role runs Microsoft IIS and accepts and responds to HTTP or HTTPS requests; Worker roles, by contrast, communicate with Azure Storage or through direct connections to clients.

7. Which of the following elements allows you to create and manage virtual machines that serve in either a Web role or a Worker role?
a) Compute
b) Application
c) Storage
d) None of the mentioned

View Answer

Answer: a [Reason:] Compute is the load-balanced Windows server computation and policy engine.

8. Which of the following elements is a non-relational storage system for large-scale storage?
a) Compute
b) Application
c) Storage
d) None of the mentioned

View Answer

Answer: c [Reason:] Azure Storage Service lets you create drives, manage queues, and store BLOBs.

9. Azure Storage plays the same role in Azure that ______ plays in Amazon Web Services.
a) S3
b) EC2
c) EC3
d) All of the mentioned

View Answer

Answer: a [Reason:] For relational database services, SQL Azure may be used.

10. Which of the following elements in Azure stands for the management service?
a) config
b) application
c) virtual machines
d) none of the mentioned

View Answer

Answer: a [Reason:] The config element is Azure's management service; virtual machines, by contrast, are instances of Windows that run the applications and services that are part of a particular deployment.

Interview MCQ Set 2

1. ___________ is the world’s most complete, tested, and popular distribution of Apache Hadoop and related projects.
a) MDH
b) CDH
c) ADH
d) BDH

View Answer

Answer: b [Reason:] Cloudera’s open-source Apache Hadoop distribution, CDH (Cloudera Distribution Including Apache Hadoop), targets enterprise-class deployments of that technology.

2. Point out the correct statement:
a) Cloudera is also a sponsor of the Apache Software Foundation
b) CDH is 100% Apache-licensed open source and is the only Hadoop solution to offer unified batch processing, interactive SQL, interactive search, and role-based access controls
c) More enterprises have downloaded CDH than all other such distributions combined
d) All of the mentioned

View Answer

Answer: d [Reason:] Cloudera says that more than 50% of its engineering output is donated upstream to the various Apache-licensed open source projects.

3. Cloudera ___________ includes CDH and an annual subscription license (per node) to Cloudera Manager and technical support.
a) Enterprise
b) Express
c) Standard
d) All of the mentioned

View Answer

Answer: a [Reason:] CDH includes the core elements of Apache Hadoop plus several additional key open source projects.

4. Cloudera Express includes CDH and a version of Cloudera ___________ lacking enterprise features such as rolling upgrades and backup/disaster recovery.
a) Enterprise
b) Express
c) Standard
d) Manager

View Answer

Answer: d [Reason:] All versions may be downloaded from Cloudera’s website.

5. Point out the wrong statement:
a) CDH contains the main, core elements of Hadoop
b) In October 2012, Cloudera announced the Cloudera Impala project
c) CDH may be downloaded from Cloudera’s website at no charge
d) None of the mentioned

View Answer

Answer: d [Reason:] CDH may be downloaded from Cloudera’s website without technical support or Cloudera Manager.

6. Cloudera Enterprise comes in ___________ editions.
a) One
b) Two
c) Three
d) Four

View Answer

Answer: c [Reason:] Cloudera Enterprise comes in three editions: Basic, Flex, and Data Hub.

7. __________ is an online NoSQL database developed by Cloudera.
a) HCatalog
b) HBase
c) Impala
d) Oozie

View Answer

Answer: b [Reason:] HBase is a distributed key-value store.

8. _______ is an open source set of libraries, tools, examples, and documentation engineered to simplify building applications on top of Hadoop.
a) Kite
b) Kize
c) Ookie
d) All of the mentioned

View Answer

Answer: a [Reason:] Kite is used to simplify the most common tasks when building applications on top of Hadoop.

9. To configure short-circuit local reads, you will need to enable ____________ on local Hadoop.
a) libraryhadoop
b) libhadoop
c) libhad
d) none of the mentioned

View Answer

Answer: b [Reason:] Short-circuit reads make use of a UNIX domain socket.
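
As a rough illustration, short-circuit local reads are switched on through HDFS client configuration; the sketch below assumes the standard property names dfs.client.read.shortcircuit and dfs.domain.socket.path and a native libhadoop already installed on the node:

```java
import org.apache.hadoop.conf.Configuration;

public class ShortCircuitConfigSketch {
    public static Configuration withShortCircuitReads() {
        Configuration conf = new Configuration();
        // Enable short-circuit local reads; this path relies on the native
        // libhadoop library so the client can use a UNIX domain socket.
        conf.setBoolean("dfs.client.read.shortcircuit", true);
        // Domain socket path shared by the DataNode and local clients.
        conf.set("dfs.domain.socket.path", "/var/lib/hadoop-hdfs/dn_socket");
        return conf;
    }
}
```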

10. CDH processes and controls sensitive data and facilitates:
a) multi-tenancy
b) flexibility
c) scalability
d) all of the mentioned

View Answer

Answer: a [Reason:] Cloudera Express offers the fastest and easiest way to get your Hadoop cluster up and running and to explore your first use cases.

Interview MCQ Set 3

1. PCollection, PTable, and PGroupedTable all support a __________ operation.
a) intersection
b) union
c) OR
d) None of the mentioned

View Answer

Answer: b [Reason:] A union operation takes a series of distinct PCollections that all have the same data type and treats them as a single virtual PCollection.
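
For instance, two text inputs of the same type can be read and treated as one logical collection. A minimal Crunch sketch, assuming an MRPipeline and placeholder input/output paths:

```java
import org.apache.crunch.PCollection;
import org.apache.crunch.Pipeline;
import org.apache.crunch.impl.mr.MRPipeline;

public class UnionSketch {
    public static void main(String[] args) {
        Pipeline pipeline = new MRPipeline(UnionSketch.class);
        PCollection<String> first = pipeline.readTextFile("/data/part-a");
        PCollection<String> second = pipeline.readTextFile("/data/part-b");
        // union() yields a single virtual PCollection spanning both inputs.
        PCollection<String> both = first.union(second);
        pipeline.writeTextFile(both, "/data/union-out");
        pipeline.done();
    }
}
```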

2. Point out the correct statement:
a) StreamPipeline executes the pipeline in-memory on the client
b) MemPipeline executes the pipeline by converting it to a series of Spark pipelines
c) MapReduce framework’s approach makes it easy for the framework to serialize data from the client to the cluster
d) All of the mentioned

View Answer

Answer: c [Reason:] SparkPipeline executes the pipeline by converting it to a series of Spark pipelines.

3. Crunch uses Java serialization to serialize the contents of all of the ______ in a pipeline definition.
a) Transient
b) DoFns
c) Configuration
d) All of the mentioned

View Answer

Answer: b [Reason:] Crunch uses Java serialization to ship the DoFns in a pipeline definition from the client to the tasks running on the cluster.

4. The inline DoFn that splits a line up into words is an inner class of:
a) Pipeline
b) MyPipeline
c) ReadPipeline
d) WritePipe

View Answer

Answer: b [Reason:] Inner classes contain references to their parent outer classes, so unless MyPipeline implements the Serializable interface, the NotSerializableException will be thrown when Crunch tries to serialize the inner DoFn.
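
A sketch of the pitfall, assuming the standard Crunch DoFn and Emitter API: because the anonymous DoFn below is an inner class, the enclosing MyPipeline must implement Serializable or Crunch will fail to serialize the function.

```java
import java.io.Serializable;

import org.apache.crunch.DoFn;
import org.apache.crunch.Emitter;
import org.apache.crunch.PCollection;
import org.apache.crunch.types.writable.Writables;

// Implements Serializable so the inner DoFn, which holds a reference to its
// enclosing instance, can be serialized and shipped to the cluster.
public class MyPipeline implements Serializable {

    public PCollection<String> splitWords(PCollection<String> lines) {
        return lines.parallelDo(new DoFn<String, String>() {
            @Override
            public void process(String line, Emitter<String> emitter) {
                for (String word : line.split("\\s+")) {
                    emitter.emit(word);
                }
            }
        }, Writables.strings());
    }
}
```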

5. Point out the wrong statement:
a) DoFns also have a number of helper methods for working with Hadoop Counters, all named increment
b) The Crunch APIs contain a number of useful subclasses of DoFn that handle common data processing scenarios and are easier to write and test
c) FilterFn class defines a single abstract method
d) None of the mentioned

View Answer

Answer: d [Reason:] Counters are an incredibly useful way of keeping track of the state of long-running data pipelines and detecting any exceptional conditions that occur during processing.
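
For example, a counter can be bumped inside a filter; this sketch assumes Crunch's FilterFn subclass of DoFn and its inherited increment helper:

```java
import org.apache.crunch.FilterFn;

// Keeps non-empty records and counts how many empty ones were dropped.
public class NonEmptyFilter extends FilterFn<String> {
    @Override
    public boolean accept(String input) {
        if (input == null || input.isEmpty()) {
            // Helper inherited from DoFn; surfaces as a Hadoop Counter.
            increment("quality", "empty-records");
            return false;
        }
        return true;
    }
}
```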

6. DoFns provide direct access to the __________ object that is used within a given Map or Reduce task via the getContext method.
a) TaskInputContext
b) TaskInputOutputContext
c) TaskOutputContext
d) All of the mentioned

View Answer

Answer: b [Reason:] There are also a number of helper methods for working with the objects associated with the TaskInputOutputContext.
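
A small sketch of a DoFn reading task information through getContext(); it assumes the context and its TaskAttemptID accessor are available at process time:

```java
import org.apache.crunch.DoFn;
import org.apache.crunch.Emitter;

// Tags each record with the ID of the map or reduce task that processed it.
public class TagWithTaskFn extends DoFn<String, String> {
    @Override
    public void process(String input, Emitter<String> emitter) {
        // getContext() exposes the underlying TaskInputOutputContext.
        String taskId = getContext().getTaskAttemptID().toString();
        emitter.emit(taskId + "\t" + input);
    }
}
```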

7. The top-level ___________ package contains three of the most important specializations in Crunch.
a) org.apache.scrunch
b) org.apache.crunch
c) org.apache.kcrunch
d) all of the mentioned

View Answer

Answer: b [Reason:] Each of these specialized DoFn implementations has associated methods on the PCollection, PTable, and PGroupedTable interfaces to support common data processing steps.

8. The Avros class also has a _____ method for creating PTypes for POJOs using Avro’s reflection-based serialization mechanism.
a) spot
b) reflects
c) gets
d) all of the mentioned

View Answer

Answer: b [Reason:] There are a couple of restrictions on the structure of the POJO.
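
A minimal sketch of building a reflection-based PType; the PageView POJO here is hypothetical, and it keeps to the usual restrictions such as having a no-argument constructor:

```java
import org.apache.crunch.types.PType;
import org.apache.crunch.types.avro.Avros;

public class ReflectSketch {

    // Hypothetical POJO; Avro's reflection-based serialization generally
    // requires a no-argument constructor and simple field types.
    public static class PageView {
        public String url;
        public long timestamp;
        public PageView() { }
    }

    public static PType<PageView> pageViewType() {
        // Derives a PType from the class via Avro reflection.
        return Avros.reflects(PageView.class);
    }
}
```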

9. The ______________ class defines a configuration parameter named LINES_PER_MAP that controls how the input file is split.
a) NLineInputFormat
b) InputLineFormat
c) LineInputFormat
d) None of the mentioned

View Answer

Answer: a [Reason:] We can set the value of the parameter via the Source interface’s inputConf method.
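
A rough sketch of setting that parameter on a Crunch source; it assumes From.formattedFile and the Source interface's inputConf method roughly as described above, together with Hadoop's NLineInputFormat.LINES_PER_MAP key:

```java
import org.apache.crunch.TableSource;
import org.apache.crunch.io.From;
import org.apache.crunch.types.writable.Writables;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.input.NLineInputFormat;

public class NLineSourceSketch {
    public static TableSource<LongWritable, Text> linesSource(String path) {
        TableSource<LongWritable, Text> source = From.formattedFile(
                path,
                NLineInputFormat.class,
                Writables.writables(LongWritable.class),
                Writables.writables(Text.class));
        // Ask the input format for 500 lines per map task.
        source.inputConf(NLineInputFormat.LINES_PER_MAP, "500");
        return source;
    }
}
```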

10. The ________ class allows developers to exercise precise control over how data is partitioned, sorted, and grouped by the underlying execution engine.
a) Grouping
b) GroupingOptions
c) RowGrouping
d) None of the mentioned

View Answer

Answer: b [Reason:] The GroupingOptions class is immutable.
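
A small sketch using the builder; it assumes a PTable<String, Long> named counts is already defined elsewhere in the pipeline:

```java
import org.apache.crunch.GroupingOptions;
import org.apache.crunch.PGroupedTable;
import org.apache.crunch.PTable;

public class GroupingSketch {
    public static PGroupedTable<String, Long> group(PTable<String, Long> counts) {
        // The immutable options object is built once and passed to groupByKey.
        GroupingOptions options = GroupingOptions.builder()
                .numReducers(10)
                .build();
        return counts.groupByKey(options);
    }
}
```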

Interview MCQ Set 4

1. What is the most basic level of storage?
a) SAN
b) DAS
c) NAS
d) ISCSI

View Answer

Answer: b

2. A NAS solution is most appropriate for what type of data environment?
a) Secured Access
b) Shared access
c) Remote access
d) Parallel access

View Answer

Answer: b

3. Which three statements describe differences between Storage Area Network (SAN) and Network Attached Storage (NAS) solutions? Choose three.
a) SAN is generally more expensive but provides higher performance
b) NAS uses TCP/IP for communication between hosts and the NAS server
c) NAS requires additional hardware on a host: a host bus adapter for connectivity
d) SAN uses proprietary protocols for communication between hosts and the SAN fabric

View Answer

Answer: a, b, d

4. I/O requests to disk storage on a SAN are called:
a) File I/Os
b) SAN I/Os
c) Block I/Os
d) Disk I/Os

View Answer

Answer: c

5. Which of the following are demerits of DAS? Choose two.
a) Interconnect limited to 10km
b) Excessive network traffic
c) Distance limitations
d) Inability to share data with other servers

View Answer

Answer: c, d

6. Which topology is best suited for a medium-sized enterprise?
a) NAS
b) SAN
c) DAS
d) None of the mentioned

View Answer

Answer: a

7. The disk controller driver in a DAS architecture is replaced in a SAN with ——
a) FC Protocol
b) iSCSI
c) TCP/IP stack
d) Any one of the mentioned

View Answer

Answer: d

8. Which storage technology requires downtime to add new hard disk capacity?
a) DAS
b) SAN
c) NAS
d) None of the mentioned

View Answer

Answer: a

9. In the SAN storage model, the operating system views storage resources as —— devices.
a) FC
b) SCSI
c) SAN
d) None of the mentioned

View Answer

Answer: b

10. Identify the network file protocol in the set below.
a) FC
b) CIFS
c) SCSI
d) NAS

View Answer

Answer: b

Interview MCQ Set 5

1. This is a repository for the storage, management, and dissemination of data in which the mechanical, lighting, electrical and computer systems are designed for maximum energy efficiency and minimum environmental impact.
a) Storage lab
b) Data Center
c) Data warehouse
d) Fabric

View Answer

Answer: b

2. This is the process of assigning storage, usually in the form of server disk drive space, in order to optimize the performance of a storage area network.
a) Storage Provisioning
b) Data mining
c) Storage assignment
d) Data Warehousing

View Answer

Answer: a

3. Simply stated, these are large boxes that hold lots of hard disks.
a) Host
b) Tape library
c) Switch
d) Disk Array

View Answer

Answer: d

4. This consists of the precautions taken so that the effects of a disaster will be minimized.
a) Data retrieval
b) Disaster recovery
c) Archive
d) Replication

View Answer

Answer: b

5. This is the practice of collecting computer files that have been packaged together for backup, to transport to some other location, for saving away from the computer so that more hard disks can be made available, or for some other purpose.
a) Backup
b) Archive
c) Migration
d) Compression

View Answer

Answer: b
