Practical exercise unit 7. Topic: MapReduce program in Java. Purpose: study the code of the map function, the reduce function, and job execution.


HDFS high availability
Replicating the namenode metadata files on several filesystems, and using the secondary namenode to create checkpoints, protects against data loss, but it does not provide high availability of the filesystem. The namenode is still a single point of failure. If it fails, all clients, including MapReduce jobs, lose the ability to read, write, or list files, because the namenode is the sole repository of the metadata and of the mapping between files and blocks. In such an event, the entire Hadoop system is effectively out of service until a new namenode is brought online.
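As an illustration, metadata replication is normally configured by listing several storage directories, typically one on a local disk and one on a remote NFS mount, in hdfs-site.xml. A minimal sketch, assuming hypothetical paths:

<property>
  <!-- The namenode writes its filesystem image and edit log
       synchronously to every directory listed here, so a copy
       survives the loss of any single disk. Paths are examples. -->
  <name>dfs.namenode.name.dir</name>
  <value>file:///data/dfs/name,file:///mnt/nfs/dfs/name</value>
</property>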
To recover from a failed namenode in this situation, an administrator starts a new primary namenode with one of the filesystem metadata replicas and configures the datanodes and clients to use the new node.
Control questions:
1. What is fault tolerance?
2. List the tasks involved in protecting information in a network.
3. What is the purpose of a backup?
4. How is network security provided in a client-server architecture?
5. What is the purpose of shielded (screened) lines in a network?
6. What is network transparency?
PRACTICAL EXERCISE UNIT 10


Topic: Reading Hadoop data via a URL


Purpose: learn how to make Java recognize Hadoop's hdfs URL scheme.
One of the simplest ways to read a file from a Hadoop filesystem is to use a java.net.URL object to open a stream and read the data from it. The general idiom looks like this:
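A minimal sketch of the idiom (the hdfs:// host and path are placeholders; IOUtils here is org.apache.hadoop.io.IOUtils):

InputStream in = null;
try {
    // Open a stream to the file; the hdfs URL scheme must
    // already be registered with java.net.URL (see below).
    in = new URL("hdfs://host/path").openStream();
    // ... process the stream ...
} finally {
    IOUtils.closeStream(in); // safe to call even if in is null
}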

A little extra work is needed to make Java recognize Hadoop's hdfs URL scheme. This is done by calling the setURLStreamHandlerFactory method on URL with an instance of FsUrlStreamHandlerFactory. The method can be called at most once per JVM, so it is typically executed in a static block. This limitation means that if some other part of your program (for example, a third-party component outside your control) calls setURLStreamHandlerFactory, you will not be able to use this approach for reading data from Hadoop. An alternative solution is discussed in the next section.
Listing 3.1 shows a program that prints files from Hadoop filesystems to standard output (similar to the Unix cat command).
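The listing itself is not reproduced in this handout; the following sketch shows what such a program looks like, consistent with the description below (the class name URLCat and the 4 KB buffer size are assumptions):

import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

// Prints a file from a Hadoop filesystem to standard output,
// like the Unix cat command.
public class URLCat {

    static {
        // Register Hadoop's handler for hdfs:// URLs.
        // setURLStreamHandlerFactory may be called only once per JVM,
        // so it is done in a static block.
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    }

    public static void main(String[] args) throws Exception {
        InputStream in = null;
        try {
            in = new URL(args[0]).openStream();
            // Copy the stream to System.out with a 4 KB buffer;
            // false means copyBytes does not close the streams itself.
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}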
We use the handy IOUtils class that comes with Hadoop to close the stream in the finally clause, and to copy bytes between the input stream and the output stream (System.out in our case). The last two arguments to the copyBytes method are the buffer size used for the copy and a flag indicating whether to close the streams when the copy is complete. We close the input stream ourselves, and System.out does not need to be closed.
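Assuming the compiled class is on Hadoop's classpath, a run might look like this (the class name and file path are hypothetical):

% hadoop URLCat hdfs://localhost/user/student/sample.txt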
