{"id":15627,"date":"2020-06-23T22:10:47","date_gmt":"2020-06-23T22:10:47","guid":{"rendered":"https:\/\/dbtut.com\/?p=15627"},"modified":"2020-06-23T22:11:08","modified_gmt":"2020-06-23T22:11:08","slug":"how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop","status":"publish","type":"post","link":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/","title":{"rendered":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop"},"content":{"rendered":"<p>Sometimes you may need the transfer table from SQL Server to HDFS. In this article we will examine that but first you may want to know more about Sqoop.<\/p>\n<h3>What is Apache Sqoop?<\/h3>\n<p>Apache Sqoop is an open source tool developed for data transfer between RDBMS and HDFS (Hadoop Distributed File System). The name Sqoop was formed by the abbreviation of SQL-to-Hadoop words. For detailed information: <a href=\"http:\/\/sqoop.apache.org\/\" target=\"_blank\" rel=\"noopener noreferrer\">Apache Sqoop<\/a><\/p>\n<p>In the example, we will work on the version of SQL Server 2014. As Hadoop distribution, we will use the VM image(single node cluster) prepared by Cloudera for test, management, development, demo and learning. Download link: <a href=\"https:\/\/www.cloudera.com\/downloads\/cdp-data-center-trial.html\" target=\"_blank\" rel=\"noopener noreferrer\">Cloudera QuickStart VM<\/a><\/p>\n<p>After downloading the VM image, you should definitely check the hash of the file. SHA1 for VMware image: like f6bb5abdde3e2f760711939152334296dae8d586 ..<\/p>\n<h3>Preparation<\/h3>\n<p>For testing, we will transfer any dimension or fact table we select from the AdventureWorksDW2014 database (warehouse) to HDFS. 
For database backup: <a href=\"https:\/\/archive.codeplex.com\/?p=msftdbprodsamples\" target=\"_blank\" rel=\"noopener noreferrer\">AdventureWorksDW2014<\/a><\/p>\n<h4>Check Java Version<\/h4>\n<p>First of all, we need to make sure that we can access the instance. Then the &#8220;<code>java -version<\/code>&#8221; command is enough to check which Java version is installed in the VM. This version will be required in the next steps.<\/p>\n<p id=\"zEfcyDr\"><img loading=\"lazy\" decoding=\"async\" width=\"758\" height=\"200\" class=\"size-full wp-image-15628  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef26f9141495.png\" alt=\"\" \/><\/p>\n<p>In this example, we see that the Java version is 1.7.<\/p>\n<h4>Install the JDBC Driver<\/h4>\n<p>After the Java check, we have to make sure that the JDBC driver that will perform the transfer is in the Sqoop library directory. To check this, we run the &#8220;<code>ls \/usr\/lib\/sqoop\/lib<\/code>&#8221; command and look for a .jar file whose name starts with &#8220;sqljdbc_~&#8221; in the output. If it is not there, we should follow the steps below.<\/p>\n<p id=\"WJPaLQm\"><img loading=\"lazy\" decoding=\"async\" width=\"718\" height=\"555\" class=\"size-full wp-image-15629  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef270400988f.png\" alt=\"\" \/><\/p>\n<p>We download the JDBC driver by running the command \u201cwget https:\/\/download.microsoft.com\/download\/~\/enu\/sqljdbc_~\u201d in the VM&#8217;s terminal. I do not give the full link because the driver is updated regularly and a fixed link could be misleading. As an example, you can review the screenshot of the command I entered. 
Download link: <a href=\"https:\/\/www.microsoft.com\/en-us\/download\/details.aspx?id=11774\" target=\"_blank\" rel=\"noopener noreferrer\">Microsoft JDBC Driver 6.0 for SQL Server<\/a><\/p>\n<p id=\"yVTdnOh\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15630  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef270be42fda.png\" alt=\"\" width=\"807\" height=\"377\" \/><\/p>\n<p>After downloading the file, we need to extract the archive and copy the driver to the relevant directory. For the first step, it is enough to run the &#8220;<code>tar zxvf sqljdbc~.tar.gz<\/code>&#8221; command in the terminal. After extracting the archive, we copy the jar file that matches the Java version we found above into the Sqoop library path. For this, we run the command &#8220;<code>sudo cp -p sqljdbc_6.0\/enu\/jre7\/sqljdbc41.jar \/usr\/lib\/sqoop\/lib\/<\/code>&#8221;.<\/p>\n<p id=\"bvtpbGU\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15631  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef274c44b6f3.png\" alt=\"\" width=\"738\" height=\"596\" \/><\/p>\n<h3>Transfer Table from SQL Server to HDFS<\/h3>\n<p>After completing the steps above, we can transfer the data. To perform the transfer, we adjust and run the following command for our environment. For example, &#8220;sqlserver:\/\/dev-mssql\\sql2014&#8221; refers to the server where the SQL Server instance is installed (dev-mssql) and the instance name (sql2014). 
IP address can be used instead of the server name.<\/p>\n<pre class=\"lang:default decode:true \">sqoop import --connect 'jdbc:sqlserver:\/\/dev-mssql\\sql2014;database=AdventureWorksDW2014' --username 'hadoop' -P --table DimReseller<\/pre>\n<p><strong>sqoop import:<\/strong> The command that transfers a table or view from the RDBMS to HDFS.<\/p>\n<p><strong>--connect:<\/strong> Parameter used to access an RDBMS such as SQL Server, MySQL or Oracle<\/p>\n<p><strong>jdbc:sqlserver:<\/strong> The JDBC driver to be used to access the RDBMS<\/p>\n<p><strong>--username &#8216;hadoop&#8217;:<\/strong> Login name used to access the RDBMS<\/p>\n<p><strong>-P:<\/strong> Prompts for the password at runtime instead of passing it on the command line<\/p>\n<p id=\"uQjgAfX\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15632  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef27677b7c2a.png\" alt=\"\" width=\"879\" height=\"548\" \/><\/p>\n<p>After running the command, if the table has a primary key, Sqoop finds its MIN and MAX values and, based on the total number of rows, determines how much data will be transferred at once. In this example, the \u201cSplit size:175\u201d and \u201cmapreduce.JobSubmitter: number of splits:4\u201d messages express this.<\/p>\n<p>The last message shows the total number of records transferred. The screenshot below shows that 701 records have been transferred.<\/p>\n<p>The &#8220;number of splits: 4&#8221; value above shows that the 701 records will be divided into 4 parts and transferred in blocks. 
This works out to 4 splits of about 175 rows each, covering all 701 records.<\/p>\n<p id=\"hNXLhiM\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15633  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef278771d122.png\" alt=\"\" width=\"841\" height=\"114\" \/><\/p>\n<p>Let&#8217;s also check the number of DimReseller records in AdventureWorksDW2014.<\/p>\n<p id=\"jzegYYB\"><img loading=\"lazy\" decoding=\"async\" width=\"733\" height=\"291\" class=\"size-full wp-image-15634  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef278a7e28be.png\" alt=\"\" \/><\/p>\n<p>After checking the record count, you can run the &#8220;<code>hdfs dfs -ls .\/DimReseller<\/code>&#8221; command, or open the HDFS file browser from the Cloudera QuickStart interface, to verify that the table now exists as files in HDFS.<\/p>\n<p id=\"fcTssSp\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15635  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef27913634d6.png\" alt=\"\" width=\"847\" height=\"237\" \/><\/p>\n<p id=\"uWfAWwP\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15636  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef27938ca212.png\" alt=\"\" width=\"939\" height=\"495\" \/><\/p>\n<p id=\"oYYMNaO\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15637  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef2795c4aed2.png\" alt=\"\" width=\"915\" height=\"563\" \/><\/p>\n<p>Since DimReseller is stored in parts, we run the &#8220;<code>hdfs dfs -cat DimReseller\/part-m-00000 | head<\/code>&#8221; command to check the content of one part and list its first 10 records.<\/p>\n<p id=\"vUMKNux\"><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-15638  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef27ab71d858.png\" alt=\"\" width=\"910\" height=\"383\" \/><\/p>\n<p id=\"zLFoQGe\"><img loading=\"lazy\" 
decoding=\"async\" class=\" wp-image-15639  aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/img_5ef27ad43a4b6.png\" alt=\"\" width=\"888\" height=\"413\" \/><\/p>\n<p>As can be seen in the screenshots, the table has been successfully transferred. After this stage, DimReseller is ready for data preparation processes for querying or batch processing.<\/p>\n<div class=\"pvc_clear\"><\/div>\n<p id=\"pvc_stats_15627\" class=\"pvc_stats all  \" data-element-id=\"15627\" style=\"\"><i class=\"pvc-stats-icon medium\" aria-hidden=\"true\"><svg aria-hidden=\"true\" focusable=\"false\" data-prefix=\"far\" data-icon=\"chart-bar\" role=\"img\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 512 512\" class=\"svg-inline--fa fa-chart-bar fa-w-16 fa-2x\"><path fill=\"currentColor\" d=\"M396.8 352h22.4c6.4 0 12.8-6.4 12.8-12.8V108.8c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v230.4c0 6.4 6.4 12.8 12.8 12.8zm-192 0h22.4c6.4 0 12.8-6.4 12.8-12.8V140.8c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v198.4c0 6.4 6.4 12.8 12.8 12.8zm96 0h22.4c6.4 0 12.8-6.4 12.8-12.8V204.8c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v134.4c0 6.4 6.4 12.8 12.8 12.8zM496 400H48V80c0-8.84-7.16-16-16-16H16C7.16 64 0 71.16 0 80v336c0 17.67 14.33 32 32 32h464c8.84 0 16-7.16 16-16v-16c0-8.84-7.16-16-16-16zm-387.2-48h22.4c6.4 0 12.8-6.4 12.8-12.8v-70.4c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v70.4c0 6.4 6.4 12.8 12.8 12.8z\" class=\"\"><\/path><\/svg><\/i> <img loading=\"lazy\" decoding=\"async\" width=\"16\" height=\"16\" alt=\"Loading\" src=\"https:\/\/dbtut.com\/wp-content\/plugins\/page-views-count\/ajax-loader-2x.gif\" border=0 \/><\/p>\n<div class=\"pvc_clear\"><\/div>\n","protected":false},"excerpt":{"rendered":"<p>Sometimes you may need the transfer table from SQL Server to HDFS. In this article we will examine that but first you may want to know more about Sqoop. What is Apache Sqoop? 
Apache Sqoop is an open source tool developed for data transfer between RDBMS and HDFS (Hadoop Distributed File System). The name Sqoop &hellip;<\/p>\n<div class=\"pvc_clear\"><\/div>\n<p id=\"pvc_stats_15627\" class=\"pvc_stats all  \" data-element-id=\"15627\" style=\"\"><i class=\"pvc-stats-icon medium\" aria-hidden=\"true\"><svg aria-hidden=\"true\" focusable=\"false\" data-prefix=\"far\" data-icon=\"chart-bar\" role=\"img\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 512 512\" class=\"svg-inline--fa fa-chart-bar fa-w-16 fa-2x\"><path fill=\"currentColor\" d=\"M396.8 352h22.4c6.4 0 12.8-6.4 12.8-12.8V108.8c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v230.4c0 6.4 6.4 12.8 12.8 12.8zm-192 0h22.4c6.4 0 12.8-6.4 12.8-12.8V140.8c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v198.4c0 6.4 6.4 12.8 12.8 12.8zm96 0h22.4c6.4 0 12.8-6.4 12.8-12.8V204.8c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v134.4c0 6.4 6.4 12.8 12.8 12.8zM496 400H48V80c0-8.84-7.16-16-16-16H16C7.16 64 0 71.16 0 80v336c0 17.67 14.33 32 32 32h464c8.84 0 16-7.16 16-16v-16c0-8.84-7.16-16-16-16zm-387.2-48h22.4c6.4 0 12.8-6.4 12.8-12.8v-70.4c0-6.4-6.4-12.8-12.8-12.8h-22.4c-6.4 0-12.8 6.4-12.8 12.8v70.4c0 6.4 6.4 12.8 12.8 12.8z\" class=\"\"><\/path><\/svg><\/i> <img loading=\"lazy\" decoding=\"async\" width=\"16\" height=\"16\" alt=\"Loading\" src=\"https:\/\/dbtut.com\/wp-content\/plugins\/page-views-count\/ajax-loader-2x.gif\" border=0 \/><\/p>\n<div 
class=\"pvc_clear\"><\/div>\n","protected":false},"author":488,"featured_media":15640,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[1324,3],"tags":[9691,9692,9690,9693,9689,9687,9688,9685,9684,9686],"class_list":["post-15627","post","type-post","status-publish","format-standard","has-post-thumbnail","","category-big-data","category-mssql","tag-migrate-data-from-sql-server-to-hadoop","tag-move-data-from-sql-server-to-hadoop","tag-moving-data-from-sql-server-to-hadoop","tag-transfer-data-from-rdbms-to-hadoop","tag-transfer-data-from-sql-server-to-hadoop","tag-transfer-data-from-sql-to-hadoop","tag-transfer-data-from-sql-to-hdfs","tag-transfer-table-from-rdbms-to-hdfs","tag-transfer-table-from-sql-server-to-hdfs","tag-transfer-table-to-hadoop"],"aioseo_notices":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v24.9 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>How to Transfer Table from SQL Server to HDFS using Apache Sqoop - Database Tutorials<\/title>\n<meta name=\"description\" content=\"How to Transfer Table from SQL Server to HDFS using Apache Sqoop\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"How to Transfer Table from SQL Server to HDFS using Apache Sqoop - Database Tutorials\" \/>\n<meta property=\"og:description\" content=\"How to 
Transfer Table from SQL Server to HDFS using Apache Sqoop\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/\" \/>\n<meta property=\"og:site_name\" content=\"Database Tutorials\" \/>\n<meta property=\"article:published_time\" content=\"2020-06-23T22:10:47+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2020-06-23T22:11:08+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png\" \/>\n\t<meta property=\"og:image:width\" content=\"904\" \/>\n\t<meta property=\"og:image:height\" content=\"483\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Emrah Erdo\u011fan\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Emrah Erdo\u011fan\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/\"},\"author\":{\"name\":\"Emrah Erdo\u011fan\",\"@id\":\"https:\/\/dbtut.com\/#\/schema\/person\/8f8a36bd455de4bb3f6db0427bbde1ab\"},\"headline\":\"How to Transfer Table from SQL Server to HDFS using Apache Sqoop\",\"datePublished\":\"2020-06-23T22:10:47+00:00\",\"dateModified\":\"2020-06-23T22:11:08+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/\"},\"wordCount\":722,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/dbtut.com\/#organization\"},\"image\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png\",\"keywords\":[\"migrate data from sql server to hadoop\",\"move data from sql server to hadoop\",\"moving data from sql server to hadoop\",\"transfer data from rdbms to hadoop\",\"transfer data from sql server to hadoop\",\"transfer data from sql to hadoop\",\"transfer data from sql to hdfs\",\"Transfer Table from rdbms to HDFS\",\"Transfer Table from SQL Server to HDFS\",\"transfer table to hadoop\"],\"articleSection\":[\"Big 
Data\",\"MSSQL\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/\",\"url\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/\",\"name\":\"How to Transfer Table from SQL Server to HDFS using Apache Sqoop - Database Tutorials\",\"isPartOf\":{\"@id\":\"https:\/\/dbtut.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png\",\"datePublished\":\"2020-06-23T22:10:47+00:00\",\"dateModified\":\"2020-06-23T22:11:08+00:00\",\"description\":\"How to Transfer Table from SQL Server to HDFS using Apache 
Sqoop\",\"breadcrumb\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage\",\"url\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png\",\"contentUrl\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png\",\"width\":904,\"height\":483},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/dbtut.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"How to Transfer Table from SQL Server to HDFS using Apache Sqoop\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/dbtut.com\/#website\",\"url\":\"https:\/\/dbtut.com\/\",\"name\":\"Database Tutorials\",\"description\":\"MSSQL, Oracle, PostgreSQL, MySQL, MariaDB, DB2, Sybase, Teradata, Big Data, NOSQL, MongoDB, Couchbase, Cassandra, Windows, 
Linux\",\"publisher\":{\"@id\":\"https:\/\/dbtut.com\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/dbtut.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/dbtut.com\/#organization\",\"name\":\"dbtut\",\"url\":\"https:\/\/dbtut.com\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/dbtut.com\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2021\/02\/dbtutlogo.jpg\",\"contentUrl\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2021\/02\/dbtutlogo.jpg\",\"width\":223,\"height\":36,\"caption\":\"dbtut\"},\"image\":{\"@id\":\"https:\/\/dbtut.com\/#\/schema\/logo\/image\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\/\/dbtut.com\/#\/schema\/person\/8f8a36bd455de4bb3f6db0427bbde1ab\",\"name\":\"Emrah Erdo\u011fan\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/dbtut.com\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/a23ad74b2dfb1a2f7ca3216abca47df06e0c4ba93d074f56c6efe140ed8a8fa5?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/a23ad74b2dfb1a2f7ca3216abca47df06e0c4ba93d074f56c6efe140ed8a8fa5?s=96&d=mm&r=g\",\"caption\":\"Emrah Erdo\u011fan\"},\"url\":\"https:\/\/dbtut.com\/index.php\/author\/emraherdogan\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop - Database Tutorials","description":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/","og_locale":"en_US","og_type":"article","og_title":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop - Database Tutorials","og_description":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop","og_url":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/","og_site_name":"Database Tutorials","article_published_time":"2020-06-23T22:10:47+00:00","article_modified_time":"2020-06-23T22:11:08+00:00","og_image":[{"width":904,"height":483,"url":"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png","type":"image\/png"}],"author":"Emrah Erdo\u011fan","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Emrah Erdo\u011fan","Est. 
reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#article","isPartOf":{"@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/"},"author":{"name":"Emrah Erdo\u011fan","@id":"https:\/\/dbtut.com\/#\/schema\/person\/8f8a36bd455de4bb3f6db0427bbde1ab"},"headline":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop","datePublished":"2020-06-23T22:10:47+00:00","dateModified":"2020-06-23T22:11:08+00:00","mainEntityOfPage":{"@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/"},"wordCount":722,"commentCount":0,"publisher":{"@id":"https:\/\/dbtut.com\/#organization"},"image":{"@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage"},"thumbnailUrl":"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png","keywords":["migrate data from sql server to hadoop","move data from sql server to hadoop","moving data from sql server to hadoop","transfer data from rdbms to hadoop","transfer data from sql server to hadoop","transfer data from sql to hadoop","transfer data from sql to hdfs","Transfer Table from rdbms to HDFS","Transfer Table from SQL Server to HDFS","transfer table to hadoop"],"articleSection":["Big 
Data","MSSQL"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/","url":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/","name":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop - Database Tutorials","isPartOf":{"@id":"https:\/\/dbtut.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage"},"image":{"@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage"},"thumbnailUrl":"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png","datePublished":"2020-06-23T22:10:47+00:00","dateModified":"2020-06-23T22:11:08+00:00","description":"How to Transfer Table from SQL Server to HDFS using Apache 
Sqoop","breadcrumb":{"@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#primaryimage","url":"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png","contentUrl":"https:\/\/dbtut.com\/wp-content\/uploads\/2020\/06\/Ads\u0131z-6.png","width":904,"height":483},{"@type":"BreadcrumbList","@id":"https:\/\/dbtut.com\/index.php\/2020\/06\/23\/how-to-transfer-table-from-sql-server-to-hdfs-using-apache-sqoop\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dbtut.com\/"},{"@type":"ListItem","position":2,"name":"How to Transfer Table from SQL Server to HDFS using Apache Sqoop"}]},{"@type":"WebSite","@id":"https:\/\/dbtut.com\/#website","url":"https:\/\/dbtut.com\/","name":"Database Tutorials","description":"MSSQL, Oracle, PostgreSQL, MySQL, MariaDB, DB2, Sybase, Teradata, Big Data, NOSQL, MongoDB, Couchbase, Cassandra, Windows, 
Linux","publisher":{"@id":"https:\/\/dbtut.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dbtut.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/dbtut.com\/#organization","name":"dbtut","url":"https:\/\/dbtut.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/dbtut.com\/#\/schema\/logo\/image\/","url":"https:\/\/dbtut.com\/wp-content\/uploads\/2021\/02\/dbtutlogo.jpg","contentUrl":"https:\/\/dbtut.com\/wp-content\/uploads\/2021\/02\/dbtutlogo.jpg","width":223,"height":36,"caption":"dbtut"},"image":{"@id":"https:\/\/dbtut.com\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/dbtut.com\/#\/schema\/person\/8f8a36bd455de4bb3f6db0427bbde1ab","name":"Emrah Erdo\u011fan","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/dbtut.com\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/a23ad74b2dfb1a2f7ca3216abca47df06e0c4ba93d074f56c6efe140ed8a8fa5?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/a23ad74b2dfb1a2f7ca3216abca47df06e0c4ba93d074f56c6efe140ed8a8fa5?s=96&d=mm&r=g","caption":"Emrah 
Erdo\u011fan"},"url":"https:\/\/dbtut.com\/index.php\/author\/emraherdogan\/"}]}},"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/posts\/15627","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/users\/488"}],"replies":[{"embeddable":true,"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/comments?post=15627"}],"version-history":[{"count":0,"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/posts\/15627\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/media\/15640"}],"wp:attachment":[{"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/media?parent=15627"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/categories?post=15627"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dbtut.com\/index.php\/wp-json\/wp\/v2\/tags?post=15627"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}