<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1d1 20130915//EN" "JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta id="journal-meta-c9612a387f7743d5a054be7cfffc3a11">
      <journal-id journal-id-type="nlm-ta">Sciresol</journal-id>
      <journal-id journal-id-type="publisher-id">Sciresol</journal-id>
      <journal-id journal-id-type="journal_submission_guidelines">http://ugit.net/publication_fsjoaj3qdho/geoeye_cm_ts9ypx7s/</journal-id>
      <journal-title-group>
        <journal-title>Geo-Eye</journal-title>
      </journal-title-group>
      <issn publication-format="electronic">XXXX-XXXX</issn>
      <issn publication-format="print"/>
    </journal-meta>
    <article-meta id="article-meta-2b138434a3184fc995d55785db26a327">
      <article-id pub-id-type="doi">10.53989/bu.ge.v14.i1.24.14</article-id>
      <article-categories>
        <subj-group>
          <subject>ORIGINAL ARTICLE</subject>
        </subj-group>
      </article-categories>
      <title-group>
        <article-title id="article-title-917b9512c0af4dec8eaf554593dcc6eb">
          <bold id="strong-facbf6f2a8d04caeb86620d0970e158b">Precision Land Use and Land Cover Classification using Google Earth Engine: Integrating Random Forest and Support Vector Machine </bold>
          <bold id="strong-67b67502747c445bb472b487584c45d6">Algorithms</bold>
        </article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author" corresp="yes">
          <name id="name-5a7d44f7261e4773ace8f458a84832c7">
            <surname>Sultana</surname>
            <given-names>Salma</given-names>
          </name>
          <email>salmas216@gmail.com</email>
          <xref id="x-ee589e252e78" rid="aff-542b3157951249e0849b43fc984b3d84" ref-type="aff">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <name id="name-5b3e9d4eced346878dc45233fe613104">
            <surname>Inayathulla</surname>
            <given-names>M</given-names>
          </name>
          <xref id="xref-38b487489a8c41bc989a356c5549d2af" rid="aff-542b3157951249e0849b43fc984b3d84" ref-type="aff">1</xref>
        </contrib>
        <aff id="aff-542b3157951249e0849b43fc984b3d84">
          <institution>Department of Civil Engineering, Bangalore University, Jnana Bharathi</institution>
          <addr-line>Bengaluru, Karnataka </addr-line>
          <country country="IN">India</country>
        </aff>
      </contrib-group>
      <pub-date date-type="pub">
        <day>20</day>
        <month>10</month>
        <year>2025</year>
      </pub-date>
      <volume>14</volume>
      <issue>1</issue>
      <fpage>54</fpage>
      <history>
        <date date-type="received">
          <day>11</day>
          <month>7</month>
          <year>2024</year>
        </date>
        <date date-type="accepted">
          <day>10</day>
          <month>10</month>
          <year>2025</year>
        </date>
      </history>
      <permissions>
        <copyright-year>2025</copyright-year>
      </permissions>
      <abstract id="abstract-abstract-title-da7b8702413240439d9d9e768c597fb3">
        <title id="abstract-title-da7b8702413240439d9d9e768c597fb3">Abstract</title>
        <p id="paragraph-40d762b7d27446b7af12bce1a608a9a1">Land Use and Land Cover (LULC) classification is crucial for understanding and managing environmental resources. This study introduces an innovative methodology that leverages Sentinel satellite data alongside two robust machine learning algorithms, Random Forest (RF) and Support Vector Machine (SVM), on the Google Earth Engine (GEE) platform. Renowned for its high-resolution multispectral imagery, Sentinel data offer rich information for classification. GEE provides access to extensive geospatial datasets and computational resources, enabling effective analysis. RF and SVM are known for their ability to handle complex datasets, optimizing classification accuracy. The study outlines a systematic workflow for preprocessing Sentinel imagery, followed by the implementation of RF and SVM algorithms, with a focus on accurately classifying vegetation, built-up areas, barren land, and water bodies. Evaluation metrics, including overall accuracy and kappa coefficient, demonstrate the efficacy of the proposed methodology. This compelling study highlights the utility of RF and SVM within GEE for precise LULC mapping, emphasizing their pivotal role in supporting informed decision-making for environmental planning and conservation initiatives.</p>
      </abstract>
      <kwd-group id="kwd-group-396aaa0ca645440aac9377c031d6d253">
        <title>Keywords</title>
        <kwd>Google Earth Engine</kwd>
        <kwd>Support Vector Machine</kwd>
        <kwd>Random Forest</kwd>
        <kwd>Sentinel satellite data</kwd>
        <kwd>Land Use and Land Cover</kwd>
      </kwd-group>
      <funding-group>
        <funding-statement>None</funding-statement>
      </funding-group>
    </article-meta>
  </front>
  <body>
    <sec>
      <title id="title-3bd47eb0b17f4cde87c03c3ff533a443">1 Introduction</title>
      <p id="paragraph-16e95b82f547448996e885517deb5b08">LULC classification plays an important role in understanding the dynamic interaction between human activities and the environment. It has become a fundamental source for assessing changes in the Earth&#8217;s surface. Understanding LULC classification provides insights into ecosystem health, habitat fragmentation, urbanization patterns, and climate change. Remote sensing techniques, along with advanced classification algorithms, are essential for accurate LULC mapping. These classifications provide valuable input for land-use planning, agricultural management, water resource assessment, and policy formulation at local, regional, and global scales. Monitoring and continuously updating LULC datasets are essential for tracking transformations in landscapes <xref id="xref-261074f6e38f4f3d806db70a2fddf52d" rid="R284032534135483" ref-type="bibr">1</xref>.</p>
      <p id="paragraph-f91180a736064687aa8e141c7d04e835">Sentinel-2 offers high-resolution multispectral data, enabling detailed and accurate classification of LULC categories. The spectral bands provided by Sentinel-2, including visible, near-infrared, and short-wave infrared, facilitate discrimination between various land cover types based on their unique spectral signatures <xref id="xref-1154d597799d4be9a2b1c78ff2ef04e5" rid="R284032534135491" ref-type="bibr">2</xref>. The Sentinel-2 satellite images used need to undergo pre-processing steps such as radiometric calibration, geometric rectification, and atmospheric correction to ensure the quality and accuracy of LULC classification <xref id="xref-bbdc770b848648a297c69e2089cfabe1" rid="R284032534135488" ref-type="bibr">3</xref>. Continuous monitoring of Sentinel-2 imagery supports temporal analysis of LULC change over time, enabling better decisions for land management, environmental monitoring, and climate change studies. Sentinel-2 imagery can capture seasonal variations and changes in vegetation health, which is essential for identifying agricultural land use, forest cover, changes in urban areas, water bodies, and other land cover types <xref id="xref-3db21c5f1b374f7cbb58721cf12d9099" rid="R284032534135482" ref-type="bibr">4</xref>. The accuracy of LULC classification can be enhanced by integrating Sentinel-2 imagery with digital elevation models (DEMs), land use maps, and field surveys. The availability of cloud-based processing platforms like GEE and the open-access nature of Sentinel-2 imagery make LULC classification cost-effective for a wide range of applications.</p>
      <p id="paragraph-06d299838faf4ea4871ccc8890c938b9">GEE is a powerful platform for conducting LULC analyses using a vast archive of satellite imagery. It offers access to a wide collection of remote sensing datasets, such as Sentinel-2, Landsat, MODIS, and more, covering a wide range of spatial and temporal resolutions suitable for LULC classification. The JavaScript-based programming interface on the GEE platform enables users to develop scripts for LULC classification, offering flexibility and customization options to suit specific research or application needs. Collaborative features on the GEE platform enable users to share code, datasets, and results, fostering collaboration and knowledge exchange in the field of LULC analysis. The scalability, accessibility, and rich feature set of GEE make it a valuable tool for conducting LULC research, monitoring environmental changes, and supporting decision-making processes <xref id="xref-92979b95f1bc4cf29cf3de2eeec862ff" rid="R284032534135487" ref-type="bibr">5</xref>.</p>
      <p id="paragraph-26effbf4161d44828daacebe399e7caf">RF is a method that combines the predictions of multiple decision trees, making it robust against overfitting and capable of handling complex tasks such as LULC classification. SVM is a supervised learning algorithm that works well for LULC classification by finding the optimal hyperplane that separates different classes in the feature space. GEE provides built-in implementations of the RF and SVM algorithms, allowing users to apply these techniques easily for LULC classification using satellite imagery <xref id="xref-1d023b7aa523449991f802cc715457be" rid="R284032534135490" ref-type="bibr">6</xref>. The JavaScript API on GEE enables users to customize RF and SVM classification workflows, including parameter tuning and feature selection, thereby optimizing classification accuracy for LULC mapping objectives. Leveraging advanced machine learning algorithms such as RF and SVM within the GEE platform has become popular for performing LULC analysis accurately and efficiently <xref id="xref-be9539dcb1754e5fb03447f26e6a5b1f" rid="R284032534135481" ref-type="bibr">7</xref>. This paper explores the application of RF and SVM for LULC classification on the GEE platform.</p>
    </sec>
    <sec>
      <title id="title-8c9736655f274536803dfefb7aed16e1">2 Materials and Methods</title>
      <sec>
        <title id="t-b17f82885b13">
          <bold id="strong-daffbf3a81354f30ba6d6a6da1a6fd13">Study Area</bold>
        </title>
        <p id="paragraph-eae396e562b94999848603587ddb075d">Bangalore is the fourth largest Municipal Corporation in India, responsible for managing a population of 6.8 million in an area of 741 km<sup id="superscript-648ae7f635cf4db4a15a7bd3b68a04ec">2</sup>. Its boundaries have expanded tenfold over the last six decades. Lying in the southeast of the south Indian state of Karnataka (<xref id="x-b730ae5fb1e8" rid="figure-2b2770517e3e464ba26d9c92e1f549cd" ref-type="fig">Figure 1</xref>), Bangalore sits in the heart of the Mysore Plateau at an average elevation of 900 meters. The city is located at 12°58′44″N latitude and 77°35′30″E longitude.</p>
        <fig id="figure-2b2770517e3e464ba26d9c92e1f549cd" orientation="portrait" fig-type="graphic" position="anchor">
          <label>Figure 1 </label>
          <caption id="caption-497389a91158484f99f920990b1d0daf">
            <title id="title-18ab58dc91184acbab86d9aaa1437052">
              <bold id="strong-65296338097445df89f440adaf56c51b">Study Area</bold>
            </title>
          </caption>
          <graphic id="graphic-3c690a49a5da4e5d904e56506a30cc1e" xlink:href="https://s3-us-west-2.amazonaws.com/typeset-prod-media-server/c01c2453-a1ec-4519-8e84-d247c28978d2image1.png"/>
        </fig>
      </sec>
      <sec>
        <title id="t-2059d29a671a">
          <bold id="strong-9fe5b957a9994d0992a4dc870f9ab722">Data</bold>
        </title>
        <p id="paragraph-e1508af4f28445999e0e5a4cc2f14f7b">This study uses Sentinel-2 satellite imagery, which provides medium-resolution multispectral data, for classifying different LULC types. Sentinel-2 imagery provides spectral information across various wavelengths, including visible, near-infrared (NIR), and short-wave infrared (SWIR) bands. A Sentinel-2 image has 13 spectral bands. The specifications of the Sentinel-2 spectral bands are shown in <xref id="x-c2e78cf46b60" rid="table-wrap-3696facd8caf4dc8ad2455053384bde6" ref-type="table">Table 1</xref>.</p>
        <table-wrap id="table-wrap-3696facd8caf4dc8ad2455053384bde6" orientation="portrait">
          <label>Table 1</label>
          <caption id="caption-d135c8dbb9b04e8dae51b051a69e39c8">
            <title id="title-d0cac5301cc7437984add6cf136b89e6">
              <bold id="strong-2d5177648bcb4947986de72bbf9c9f9d">Specifications of Sentinel-2 spectral bands</bold>
            </title>
          </caption>
          <table id="table-a74ae919ebd24227ae3424cf14bd0d07" rules="rows">
            <colgroup>
              <col width="24.97"/>
              <col width="44.730000000000004"/>
              <col width="30.3"/>
            </colgroup>
            <tbody id="table-section-601eedd0194b48809cf58f8b9b78d722">
              <tr id="table-row-d8e82d9467774178849e040d0b4b007d">
                <td id="table-cell-f363c3a7929648aeb54a312378ba22ab" align="left">
                  <p id="paragraph-4f241a28d59540cdbff4231f50f4ae10"> <bold id="strong-b437fb077888433ca4c87a39815bff32">Bands</bold></p>
                </td>
                <td id="table-cell-ed0d0e320fdc4641b602c3d076fd2a79" align="left">
                  <p id="paragraph-040e476fc28141f58f02675e45a74252"> <bold id="strong-f66b5b14758149bbbdcc7e032e6a5fd3">Description</bold></p>
                </td>
                <td id="table-cell-c3eebe9bf6254a70986f443f545249e8" align="left">
                  <p id="paragraph-e4f19ebe75104bce8d592db404971d78"> <bold id="strong-67ace0e9de6047c6b49c631f25e71c02">Spatial resolution</bold></p>
                </td>
              </tr>
              <tr id="table-row-c9d17c0a7a924fa9b80a62097f57a787">
                <td id="table-cell-afb1d1fdbd3a4752b07fb239bcc5f4fa" align="left">
                  <p id="paragraph-5f91da83317246e79a6f9bb5f57d831f"> 2</p>
                </td>
                <td id="table-cell-94b94ebd7f294a26a6ffce5f12215f04" align="left">
                  <p id="paragraph-93998267b1324c4bb488178f8ceb1433"> Blue</p>
                </td>
                <td id="table-cell-b82257c2a9af4dc5bb09a76fd235416e" align="left">
                  <p id="paragraph-48a9a2d5591e44bcbdd3341ce085175f"> 10</p>
                </td>
              </tr>
              <tr id="table-row-f41ab54dcca5417eab8c11f22477aee0">
                <td id="table-cell-b6082f0fa2ee47f3bbf0bf1ef16330bc" align="left">
                  <p id="paragraph-7784d6fc8ae8455ba86694940ab2ab29"> 3</p>
                </td>
                <td id="table-cell-cd3b181a7d81499ea179691142e8c712" align="left">
                  <p id="paragraph-32562634f1e24d6191cfdc2c7987eea0"> Green </p>
                </td>
                <td id="table-cell-b0af1663b85b475e9d618c6d6190c362" align="left">
                  <p id="paragraph-ac7a5a51bc5049ceaa7b788f5b3f54d3"> 10</p>
                </td>
              </tr>
              <tr id="table-row-fb5ee35000be44078845f60e2de37b97">
                <td id="table-cell-68cf3b35905f4a5cab5e5adf8769d05f" align="left">
                  <p id="paragraph-63fab325c7bd4df7a96fbaa30df5aecb"> 4</p>
                </td>
                <td id="table-cell-78edd401d1164a2e971bb5fe96418d93" align="left">
                  <p id="paragraph-a239c6477c864db7a585c258e715f587"> Red</p>
                </td>
                <td id="table-cell-a9c37594ae5241c799a1f5751fe2d318" align="left">
                  <p id="paragraph-5414948d3f6f44ec936ad8a111dccddd"> 10</p>
                </td>
              </tr>
              <tr id="table-row-2f35571c72624c4191fddbad43834ddb">
                <td id="table-cell-e130ad9fabdf4c6b8ac117c4c8672e1c" align="left">
                  <p id="paragraph-e717654f41814b4891c84ce603044a3b"> 5</p>
                </td>
                <td id="table-cell-09dabf0d793d4bff998497143a6a7d2d" align="left">
                  <p id="paragraph-7eb56628c459473f8d434e8a48890e35"> Red Edge 1</p>
                </td>
                <td id="table-cell-8130183e52c348f69f7aa1f7b95f2f1b" align="left">
                  <p id="paragraph-0295bc4718c64e059b7ec4b2d70cdc92"> 20</p>
                </td>
              </tr>
              <tr id="table-row-257f2f0d48014e498a48f4b893c1a655">
                <td id="table-cell-37b15e25c2f84903a0eee6c3e3fa7bb7" align="left">
                  <p id="paragraph-7e9125962c694c298b41253d66e48768"> 6</p>
                </td>
                <td id="table-cell-41fa1621e51d4b97a8f8310c5132af25" align="left">
                  <p id="paragraph-d560aa9a60b74f85a35bf1bda8c386af"> Red Edge 2</p>
                </td>
                <td id="table-cell-fe1339402ec449b595d5a0346e819207" align="left">
                  <p id="paragraph-afbfb83d11294697822d1945c2c48e95"> 20</p>
                </td>
              </tr>
              <tr id="table-row-1cd17178ebcb4ab8bc360d13965fb4ac">
                <td id="table-cell-a1d2a5e78aad42f793ee64eaae1ea43b" align="left">
                  <p id="paragraph-d4ae673267d24552a2122681d2db0d0c"> 7</p>
                </td>
                <td id="table-cell-b66c4275ff7240939287c9d61313bbcf" align="left">
                  <p id="paragraph-f8c623a968ae48fa9fe9eb13567ffd75"> Red Edge 3</p>
                </td>
                <td id="table-cell-a0174511121448d29f5047f8778e1739" align="left">
                  <p id="paragraph-bf62b7f7d72c4f339b0f1e8115ae8693"> 20</p>
                </td>
              </tr>
              <tr id="table-row-270c288c604243c6b112f09d431c0b6b">
                <td id="table-cell-f068c8300fd04be8828c904ef382ea7e" align="left">
                  <p id="paragraph-3c00e2504a0642e98d729e5cee4df5c5"> 8</p>
                </td>
                <td id="table-cell-083aee5fa9594179818ed283e025759a" align="left">
                  <p id="paragraph-50ebe6fb56b94b85a1815cb7493de4b2"> NIR</p>
                </td>
                <td id="table-cell-9e39f66646c84553ad95442086e762de" align="left">
                  <p id="paragraph-5cf805abb9ac4fe5a896807dd4354409"> 10</p>
                </td>
              </tr>
              <tr id="table-row-ebc255b4b7124415a393e12f1f01ae65">
                <td id="table-cell-65cc452075824e3c978af8594b0d23a6" align="left">
                  <p id="paragraph-258d441b7892469a816ef72fbf555d43"> 11</p>
                </td>
                <td id="table-cell-6345af9c5b1a43ba887ab14a1bae63c5" align="left">
                  <p id="paragraph-c4ae280f46324f1ab0c9b3b7952e31d4"> SWIR</p>
                </td>
                <td id="table-cell-57fe8986fadb4c9eb823c9ba8bdce95c" align="left">
                  <p id="paragraph-9cc223f3cbc34e7bbf070af248cf8d16"> 20</p>
                </td>
              </tr>
              <tr id="table-row-3fb82535a9d745989108166c7f9ec4ac">
                <td id="table-cell-753e2207bcd94f6e88734b4dfae7c89c" align="left">
                  <p id="paragraph-f6716543427d45d997bdcf20f1d929ab"> 12</p>
                </td>
                <td id="table-cell-6bb69430f9ba42828836f927fc8df5fe" align="left">
                  <p id="paragraph-6f54d002635f449f8126a2825554bbac"> SWIR</p>
                </td>
                <td id="table-cell-d13459f29c044cd2992110704b9ad742" align="left">
                  <p id="paragraph-ccb44b6615a648c79b90898bbe53ea97"> 20</p>
                </td>
              </tr>
              <tr id="table-row-421184da83604c9dbc9b34d159f748d9">
                <td id="table-cell-520c1710232042b5922ca683862ab829" colspan="3" align="left">
                  <p id="paragraph-da3f646b6f26452ca2bd80e374f7e4d7"> NIR – Near-infrared</p>
                  <p id="p-4b144b98dd89">SWIR – Short-wave infrared</p>
                </td>
              </tr>
            </tbody>
          </table>
        </table-wrap>
        <p id="paragraph-832ac329901247fbaa644a16af471c38">The Sentinel-2 imagery used in this LULC classification study is accessed directly from the Copernicus Open Access Hub through the GEE (https://earthengine.google.com/ ) platform <xref id="xref-b8e3564596fc46c4995cf4ee380f0647" rid="R284032534135485" ref-type="bibr">8</xref>. The Sentinel-2 images used in the study are dated from 2023-01-12 to 2024-03-07.</p>
      </sec>
      <sec>
        <title id="t-d8d248858ab0">
          <bold id="s-f26a87933e64">Methodology</bold>
        </title>
        <p id="paragraph-d6b3a3f84ea14b6a96bf38efac8c391b">The Sentinel-2 satellite images used in the study are acquired from the European Space Agency (ESA) Sentinel mission, specifically the Sentinel-2A and Sentinel-2B satellites. These satellites orbit the Earth in a sun-synchronous manner, capturing images with a revisit time of five days and ensuring frequent and consistent monitoring of the Earth's surface. Cloud masking techniques are applied to mitigate the impact of cloud cover. The spectral bands of Sentinel-2 imagery, coupled with its spatial resolution, enable the discrimination of various land cover features, including water bodies, built-up areas, barren land, and vegetation. The step-by-step methodology followed in the present study on the GEE platform is shown in <xref id="x-03ca94ece148" rid="figure-086d6d00beeb43eb8e7841d662bafa55" ref-type="fig">Figure 2</xref>.</p>
        <fig id="figure-086d6d00beeb43eb8e7841d662bafa55" orientation="portrait" fig-type="graphic" position="anchor">
          <label>Figure 2 </label>
          <caption id="caption-e1cdd0522a1b4c40b9e66a198baab632">
            <title id="title-d781112f84ee4564965f25e70923738e">
              <bold id="strong-f65e5a5629964eaa954430babca50755">Methodology for LULC classification using GEE platform</bold>
            </title>
          </caption>
          <graphic id="graphic-b5b7fbdef0f74ba597b9eb87d2b39832" xlink:href="https://s3-us-west-2.amazonaws.com/typeset-prod-media-server/c01c2453-a1ec-4519-8e84-d247c28978d2image2.png"/>
        </fig>
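As an illustration of this workflow, a minimal Earth Engine Code Editor (JavaScript) sketch is given below. It is not the exact script used in this study: the collection ID, cloud threshold, band list, classifier parameters, and the placeholders `studyArea` (an `ee.Geometry`) and `trainingSamples` (an `ee.FeatureCollection` with a `landcover` class property) are all illustrative assumptions.

```javascript
// Illustrative GEE Code Editor sketch (runs only inside Google Earth Engine).
// Build a cloud-filtered median Sentinel-2 composite over the study area.
var s2 = ee.ImageCollection('COPERNICUS/S2_SR')
  .filterBounds(studyArea)                              // studyArea: assumed ee.Geometry
  .filterDate('2023-01-12', '2024-03-07')
  .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 10)); // simple cloud screening
var composite = s2.median().clip(studyArea);

var bands = ['B2', 'B3', 'B4', 'B8', 'B11', 'B12'];

// Sample spectral values at labelled training points (assumed collection).
var samples = composite.select(bands).sampleRegions({
  collection: trainingSamples,
  properties: ['landcover'],
  scale: 10
});

// GEE's built-in RF and SVM classifiers (illustrative parameter values).
var rf = ee.Classifier.smileRandomForest(100)
  .train({features: samples, classProperty: 'landcover', inputProperties: bands});
var svm = ee.Classifier.libsvm({kernelType: 'RBF', gamma: 0.5, cost: 10})
  .train({features: samples, classProperty: 'landcover', inputProperties: bands});

var rfMap = composite.select(bands).classify(rf);
var svmMap = composite.select(bands).classify(svm);
```

The same structure applies to either classifier; only the `ee.Classifier` constructor and its parameters change.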
      </sec>
      <sec>
        <title id="t-028d13494b54">
          <bold id="strong-13735865e554428bbed6ad2778a022fb">Algorithms</bold>
        </title>
        <sec>
          <title id="t-74f1d3ba68aa">
            <bold id="strong-65dcbb4b9e14488f9fa912b3893f110c">Support Vector Machine (SVM):</bold>
          </title>
          <p id="paragraph-19dd6e4016ef449fa676f3f8b2d66574">SVM is a supervised learning algorithm used for classification and regression tasks, including LULC classification <xref id="xref-2b57d7e6772a478ea6e70508bee24436" rid="R284032534135484" ref-type="bibr">9</xref>. SVM has emerged as a popular method because of its ability to handle high-dimensional data and nonlinear decision boundaries; it can handle datasets with a large number of features, which makes it suitable for LULC classification. SVM algorithms are implemented on the GEE platform to classify different land cover categories based on the spectral signatures extracted from satellite imagery. An advantage of SVM is its versatility in handling both linear and nonlinear classification problems. With the help of kernel functions, SVM can map input features into higher-dimensional spaces, allowing for more complex decision boundaries that better capture the underlying structure of the data. SVM classifiers trained in GEE can be optimized through parameter tuning and feature selection, thereby improving classification accuracy. The ability of SVM to handle complex datasets and nonlinear decision boundaries makes it a valuable tool for accurately mapping land cover types and monitoring landscape change over time.</p>
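Conceptually, a trained linear SVM assigns a pixel to a class by the sign of the hyperplane decision function f(x) = w·x + b. The tiny plain-JavaScript sketch below (function name and weights are illustrative assumptions, not part of this study) shows only that decision step, not the training or kernel machinery.

```javascript
// Sign of the hyperplane decision function f(x) = w.x + b (conceptual sketch).
function svmDecision(w, b, x) {
  var score = b;
  for (var i = 0; i < w.length; i++) {
    score += w[i] * x[i];          // dot product w.x
  }
  return score >= 0 ? 1 : -1;      // +1: one class, -1: the other
}
```

With weights w = [1, -1] and b = 0, the feature vector [2, 1] falls on the positive side (f = 1) and [1, 3] on the negative side (f = -2).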
        </sec>
        <sec>
          <title id="t-64ab2d2a2cb1">
            <bold id="strong-e43341d66908472981e02f52d82510f9">Random Forest (RF):</bold>
          </title>
          <p id="paragraph-549413ef00b44bb0b385a86ea34edb39">RF is a widely used ensemble learning method employed in regression, classification, and other machine learning tasks. RF has gained prominence in LULC classification due to its ability to handle high-dimensional data and complex decision boundaries while mitigating overfitting. RF works by constructing multiple decision trees during the training phase. Each tree is trained on a subset of the dataset and makes an individual prediction; the final prediction is determined by aggregating the predictions of all trees, commonly through a majority-voting mechanism for classification tasks <xref id="xref-e511baa7b1604d94a90288f04eedccc4" rid="R284032534135486" ref-type="bibr">10</xref>. This ensemble approach reduces the risk of bias, thereby enhancing the robustness and accuracy of the classifier. LULC classification on the GEE platform uses the RF algorithm to classify different land covers based on spectral signatures extracted from the satellite imagery. </p>
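The majority-voting aggregation described above can be sketched in plain JavaScript (outside GEE); the function name and the string class labels are illustrative assumptions.

```javascript
// Aggregate per-tree class predictions by majority vote (conceptual sketch).
function majorityVote(treePredictions) {
  var counts = {};
  treePredictions.forEach(function (label) {
    counts[label] = (counts[label] || 0) + 1;   // tally votes per class label
  });
  var winner = null;
  Object.keys(counts).forEach(function (label) {
    if (winner === null || counts[label] > counts[winner]) {
      winner = label;                           // keep the most-voted label
    }
  });
  return winner;
}
```

For example, if three of five trees predict "urban", the forest's final prediction is "urban".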
        </sec>
        <sec>
          <title id="t-57e206615e48">
            <bold id="strong-97140320d47544399a6ac2701f2f261f">Accuracy </bold>
            <bold id="strong-69243867cd7f41868d7591b2bb9544e7">Assessment:</bold>
          </title>
          <p id="paragraph-bd82f34ba4964bd8a46f9a676a4e0205">Accuracy assessment is a very important step in evaluating the reliability and quality of LULC classification derived from satellite imagery. Accuracy can be checked using several methods, including the confusion (error) matrix, the kappa coefficient, and user's accuracy. In the present study, accuracy assessment is performed to evaluate the performance of the models used. The training samples composed for water bodies, urban, barren land, and vegetation were scripted in JavaScript and divided into 80% training and 20% testing datasets. The performance of each classification model is evaluated using a confusion matrix, which provides a detailed breakdown of predicted and actual classes, allowing various accuracy metrics to be calculated. The overall accuracy (OA) and kappa coefficient (K<sub id="subscript-bab131d92f6a429c9af6ad14d74d895a">C</sub>) are calculated using the following equations <xref id="xref-534123d50ea44a9cb62cd33bffe7f145" rid="R284032534135489" ref-type="bibr">11</xref>. </p>
          <disp-formula-group id="disp-formula-group-0af95f9a5f5d4f9cbd05a32c1b6297ca"> <disp-formula><mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML"><mml:mi>O</mml:mi><mml:mi>A</mml:mi><mml:mo>=</mml:mo><mml:mfenced><mml:mfrac><mml:msub><mml:mi>P</mml:mi><mml:mi>c</mml:mi></mml:msub><mml:msub><mml:mi>P</mml:mi><mml:mi>n</mml:mi></mml:msub></mml:mfrac></mml:mfenced><mml:mo>*</mml:mo><mml:mn>100</mml:mn></mml:math></disp-formula></disp-formula-group>
          <p id="paragraph-d3ddc90c94184323805d3e68194356d0"><inline-formula id="inline-formula-f92ced67324543ef901c36cd8ea99f69"> <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML"><mml:msub><mml:mi>P</mml:mi><mml:mi>c</mml:mi></mml:msub></mml:math></inline-formula> - Number of pixels classified correctly</p>
          <p id="paragraph-e44222a04dfd47acb99b49c37d40f7d5"><inline-formula id="inline-formula-d302bc963f29494f8c9f9003590f7c71"> <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML"><mml:msub><mml:mi>P</mml:mi><mml:mi>n</mml:mi></mml:msub></mml:math></inline-formula>- Total number of pixels</p>
          <disp-formula-group id="disp-formula-group-80fbff50d3d14fe8a271e0b4c5cc3dff"> <disp-formula><mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML"><mml:msub><mml:mi>K</mml:mi><mml:mi>c</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>N</mml:mi><mml:msubsup><mml:mo>&#8721;</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>r</mml:mi></mml:msubsup><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:msubsup><mml:mo>&#8721;</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>r</mml:mi></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>+</mml:mo></mml:mrow></mml:msub><mml:mo>&#215;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mo>+</mml:mo><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:msup><mml:mi>N</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:mo>-</mml:mo><mml:msubsup><mml:mo>&#8721;</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>r</mml:mi></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>+</mml:mo></mml:mrow></mml:msub><mml:mo>&#215;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mo>+</mml:mo><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:math></disp-formula></disp-formula-group>
          <p id="paragraph-51846f44e63643a8882842d91671e196">Where, </p>
          <p id="paragraph-3ce2ca156f8f465cb99eb21577acbbeb">r - Number of rows and columns in the error matrix, </p>
          <p id="paragraph-c0bb6056dcdc4346855dcce34ea918d4"><inline-formula id="inline-formula-851f776e799a4bdb880ceecc469c8009"> <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>- Number of observations in row i and column i, </p>
          <p id="paragraph-ee1a505beef644ada4b20ae85ec7e0b1"><inline-formula id="inline-formula-15388ce7da4846759f66d005674bc686"> <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>+</mml:mo></mml:mrow></mml:msub></mml:math></inline-formula>- Marginal total of row i, </p>
          <p id="paragraph-33dd1b48c839443e85f063c4935d5065"><inline-formula id="inline-formula-d23e6b09d15d4f41807d94527cc68312"> <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mo>+</mml:mo><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>- Marginal total of column i, and </p>
          <p id="paragraph-d1b4017cb6884fba99fec8f15d7c1f3a">N - Total number of observations.</p>
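The two equations above can be computed directly from a confusion matrix. The plain-JavaScript sketch below (function names are illustrative) mirrors them term by term: correct pixels over total pixels for OA, and the N&#183;&#931;x<sub>ii</sub> minus marginal-product terms for K<sub>C</sub>.

```javascript
// Overall accuracy: OA = (correctly classified pixels / total pixels) * 100.
function overallAccuracy(matrix) {
  var correct = 0, total = 0;
  for (var i = 0; i < matrix.length; i++) {
    for (var j = 0; j < matrix.length; j++) {
      total += matrix[i][j];
      if (i === j) correct += matrix[i][j];   // diagonal = agreement
    }
  }
  return 100 * correct / total;
}

// Kappa: Kc = (N * sum(x_ii) - sum(x_i+ * x_+i)) / (N^2 - sum(x_i+ * x_+i)).
function kappaCoefficient(matrix) {
  var r = matrix.length;
  var N = 0, diag = 0, marginal = 0;
  var rowTotal = [], colTotal = [];
  for (var i = 0; i < r; i++) { rowTotal[i] = 0; colTotal[i] = 0; }
  for (var i = 0; i < r; i++) {
    for (var j = 0; j < r; j++) {
      N += matrix[i][j];
      rowTotal[i] += matrix[i][j];            // x_i+ : marginal total of row i
      colTotal[j] += matrix[i][j];            // x_+i : marginal total of column i
    }
  }
  for (var i = 0; i < r; i++) {
    diag += matrix[i][i];                     // sum of x_ii
    marginal += rowTotal[i] * colTotal[i];    // sum of x_i+ * x_+i
  }
  return (N * diag - marginal) / (N * N - marginal);
}
```

For the 2&#215;2 example matrix [[45, 5], [5, 45]] these give OA = 90% and K<sub>C</sub> = 0.8.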
        </sec>
      </sec>
    </sec>
    <sec>
      <title id="title-f81c49dfa24348019588952084b7c220">3 Results</title>
      <p id="paragraph-f42244cc55d548f8afbe95e74a3b316c">The research evaluates the performance of two machine learning techniques: SVM and RF. In the present study, four major land cover types are classified: water bodies, urban, barren land, and vegetation. The LULC classification of Bangalore district produced using RF and SVM is shown in <xref id="x-526cee08cbe3" rid="figure-c6c0bd5bd6a8423480bcc195c6e962e4" ref-type="fig">Figure 3</xref>. The results of the RF model, shown in <xref id="x-dc9b6643b532" rid="figure-4cfd707ece9b4a43969a6153c8d2892b" ref-type="fig">Figure 4</xref>, reveal that 40.72 km<sup id="superscript-843eeab4886c4499a0251574fb8922d5">2</sup> is classified as water bodies, 153.353 km<sup id="superscript-bc162a72483a45758ce7153c44190b27">2</sup> as urban, 29.9 km<sup id="superscript-66e25a56538c44788813d1139cd8eac0">2</sup> as barren land, and 580.882 km<sup id="superscript-df0be49280f3441f8c5cc16a463d43b7">2</sup> as vegetation. The SVM results, shown in <xref id="x-8809839929eb" rid="figure-519f4a9201e640f58274c5ab45cc1eaa" ref-type="fig">Figure 5</xref>, show 41.37 km<sup id="superscript-994c74768fa248a785875fcecd6bef56">2</sup> classified as water bodies, 1396.986 km<sup id="superscript-84d4d8e67ead40c6810cbeb7d5e4ea2e">2</sup> as urban, 155.11 km<sup id="superscript-3cb89d3623aa40e198d93c57d496a2cf">2</sup> as barren land, and 618.723 km<sup id="superscript-7d573590f23f419b87065d13efd759b3">2</sup> as vegetation.</p>
      <fig id="figure-c6c0bd5bd6a8423480bcc195c6e962e4" orientation="portrait" fig-type="graphic" position="anchor">
        <label>Figure 3 </label>
        <caption id="caption-8d51b9e2684e401ebd12a70863af3dfb">
          <title id="title-3a8664bf7f534e71a6ed8b8ed2f8136b">
            <bold id="strong-99259b7b8d8a4caebbefe1820585a4fc">(a) LULC classification using RF, (b) LULC classification using SVM</bold>
          </title>
        </caption>
        <graphic id="graphic-ba785f37b9024b7c965cc5179d4af129" xlink:href="https://s3-us-west-2.amazonaws.com/typeset-prod-media-server/c01c2453-a1ec-4519-8e84-d247c28978d2image3.png"/>
      </fig>
      <fig id="figure-4cfd707ece9b4a43969a6153c8d2892b" orientation="portrait" fig-type="graphic" position="anchor">
        <label>Figure 4 </label>
        <caption id="caption-92615b8171fc4829b919550addaeafe0">
          <title id="title-9af552d605fb4739b907b693bec05dff">
            <bold id="strong-712891f19ff84963aeac6084f7196efd">Area of LULC classes using RF</bold>
          </title>
        </caption>
        <graphic id="graphic-47d105b32aee427cbe20d928e69d63cb" xlink:href="https://s3-us-west-2.amazonaws.com/typeset-prod-media-server/c01c2453-a1ec-4519-8e84-d247c28978d2image4.png"/>
      </fig>
      <fig id="figure-519f4a9201e640f58274c5ab45cc1eaa" orientation="portrait" fig-type="graphic" position="anchor">
        <label>Figure 5 </label>
        <caption id="caption-86401bf588c24c5d8a5d5ec17731247a">
          <title id="title-5e5dcaf0bb2d40b89cd234b5f8dc3c67">
            <bold id="strong-af8f7f76e3774a2d8163edbaefcbb2eb">Area of LULC classes using SVM</bold>
          </title>
        </caption>
        <graphic id="graphic-a735bd5197cf4489938c781fc37c5f3e" xlink:href="https://s3-us-west-2.amazonaws.com/typeset-prod-media-server/c01c2453-a1ec-4519-8e84-d247c28978d2image5.png"/>
      </fig>
      <sec>
        <title id="t-d9a65e74a6af">
          <bold id="s-7d2dbf549dd8">Results Validation</bold>
        </title>
        <p id="paragraph-169241031b8f421e85c6757f7f41be57">After performing LULC classification, the results obtained from the RF and SVM algorithms were validated by determining the overall accuracy (OA) and kappa coefficient (k<sub id="subscript-c150a67b0287485fb2b2048f20331985">c</sub>). The OA for RF was determined to be 89.74%, representing the proportion of correctly classified pixels across all land cover categories. The k<sub id="subscript-f9952192327c48afb1b3090cb7033e0f">c</sub> yielded a value of 0.87, indicating substantial agreement between observed and predicted classifications.</p>
        <p id="paragraph-c25b52fffbc149e2966dbacde9d15122">The OA for the SVM model was higher, at 92.86%, which shows that a greater proportion of pixels was correctly classified than with the RF model. The k<sub id="subscript-35a2205c565a4dae8a68a9b11e2b9135">c</sub> yielded a value of 0.89, indicating a high level of agreement between observed and predicted classes. The study shows that the SVM algorithm performs better than RF.</p>
        <p id="paragraph-ff026ad0c1e84d11b733f7eeb955fe16">The results underscore the effectiveness of both the RF and SVM models in accurately classifying LULC, with SVM showing slightly higher performance than RF. The inclusion of precision values further enriches the assessment, providing insight into the accuracy of individual land cover classes within the model outputs.</p>
      </sec>
    </sec>
    <sec>
      <title id="title-a7a3430a5f3a4117bd073126798ed3c9">4 Conclusion</title>
      <p id="paragraph-c1007469e0564c9cb8ea7982cd2f7b41">In this study, both the RF and SVM algorithms demonstrated high accuracy in LULC classification, thereby highlighting their effectiveness in extracting meaningful information from satellite imagery. Leveraging Sentinel-2 satellite imagery and the GEE platform has revolutionized LULC mapping by providing access to high-resolution multispectral imagery and advanced processing capabilities. The accuracy of LULC mapping can be increased by providing more training points, and the use of robust classification algorithms can enhance the reliability and precision of land cover classification.</p>
      <p id="paragraph-cfd5cabcc9d14acb9018e5ace8c9d201">Furthermore, continued advancement in remote sensing technologies, machine learning algorithms, and geospatial techniques will further enhance the accuracy of LULC classification. By making use of Sentinel-2 satellite imagery, the GEE platform, and advanced classification algorithms such as RF and SVM, informed decisions can be made, thereby supporting sustainable land management practices and addressing pressing environmental challenges on a global scale.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <title>References</title>
      <ref id="R284032534135483">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Baig</surname>
              <given-names>Mohammed Feras</given-names>
            </name>
            <name>
              <surname>Mustafa</surname>
              <given-names>Muhammad Raza Ul</given-names>
            </name>
            <name>
              <surname>Baig</surname>
              <given-names>Imran</given-names>
            </name>
            <name>
              <surname>Takaijudin</surname>
              <given-names>Husna Binti</given-names>
            </name>
            <name>
              <surname>Zeshan</surname>
              <given-names>Muhammad Talha</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Assessment of Land Use Land Cover Changes and Future Predictions Using CA-ANN Simulation for Selangor, Malaysia</article-title>
          <source>Water</source>
          <year>2022</year>
          <volume>14</volume>
          <issue>3</issue>
          <fpage>402</fpage>
          <publisher-name>MDPI AG</publisher-name>
          <uri>https://doi.org/10.3390/w14030402</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135491">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Kobayashi</surname>
              <given-names>Nobuyuki</given-names>
            </name>
            <name>
              <surname>Tani</surname>
              <given-names>Hiroshi</given-names>
            </name>
            <name>
              <surname>Wang</surname>
              <given-names>Xiufeng</given-names>
            </name>
            <name>
              <surname>Sonobe</surname>
              <given-names>Rei</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Crop classification using spectral indices derived from Sentinel-2A imagery</article-title>
          <source>Journal of Information and Telecommunication</source>
          <year>2020</year>
          <volume>4</volume>
          <issue>1</issue>
          <fpage>67</fpage>
          <lpage>90</lpage>
          <issn>2475-1839, 2475-1847</issn>
          <publisher-name>Informa UK Limited</publisher-name>
          <uri>https://dx.doi.org/10.1080/24751839.2019.1694765</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135488">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Alshari</surname>
              <given-names>Eman A</given-names>
            </name>
            <name>
              <surname>Gawali</surname>
              <given-names>Bharti W</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Analysis of Machine Learning Techniques for Sentinel-2A Satellite Images</article-title>
          <source>Journal of Electrical and Computer Engineering</source>
          <year>2022</year>
          <volume>2022</volume>
          <fpage>1</fpage>
          <lpage>16</lpage>
          <issn>2090-0155, 2090-0147</issn>
          <publisher-name>Wiley</publisher-name>
          <uri>https://dx.doi.org/10.1155/2022/9092299</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135482">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Nguyen</surname>
              <given-names>Huong Thi Thanh</given-names>
            </name>
            <name>
              <surname>Doan</surname>
              <given-names>Trung Minh</given-names>
            </name>
            <name>
              <surname>Tomppo</surname>
              <given-names>Erkki</given-names>
            </name>
            <name>
              <surname>McRoberts</surname>
              <given-names>Ronald E</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Land Use/Land Cover Mapping Using Multitemporal Sentinel-2 Imagery and Four Classification Methods—A Case Study from Dak Nong, Vietnam</article-title>
          <source>Remote Sensing</source>
          <year>2020</year>
          <volume>12</volume>
          <issue>9</issue>
          <fpage>1367</fpage>
          <issn>2072-4292</issn>
          <publisher-name>MDPI AG</publisher-name>
          <uri>https://doi.org/10.3390/rs12091367</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135487">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Shafizadeh-Moghadam</surname>
              <given-names>Hossein</given-names>
            </name>
            <name>
              <surname>Khazaei</surname>
              <given-names>Morteza</given-names>
            </name>
            <name>
              <surname>Alavipanah</surname>
              <given-names>Seyed Kazem</given-names>
            </name>
            <name>
              <surname>Weng</surname>
              <given-names>Qihao</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Google Earth Engine for large-scale land use and land cover mapping: an object-based classification approach using spectral, textural and topographical factors</article-title>
          <source>GIScience &amp; Remote Sensing</source>
          <year>2021</year>
          <volume>58</volume>
          <issue>6</issue>
          <fpage>914</fpage>
          <lpage>928</lpage>
          <issn>1548-1603, 1943-7226</issn>
          <publisher-name>Informa UK Limited</publisher-name>
          <uri>https://dx.doi.org/10.1080/15481603.2021.1947623</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135490">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Atef</surname>
              <given-names>Islam</given-names>
            </name>
            <name>
              <surname>Ahmed</surname>
              <given-names>Wael</given-names>
            </name>
            <name>
              <surname>Abdel-Maguid</surname>
              <given-names>Ramadan H</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Modelling of land use land cover changes using machine learning and GIS techniques: a case study in El-Fayoum Governorate, Egypt</article-title>
          <source>Environmental Monitoring and Assessment</source>
          <year>2023</year>
          <volume>195</volume>
          <issue>6</issue>
          <issn>0167-6369, 1573-2959</issn>
          <publisher-name>Springer Science and Business Media LLC</publisher-name>
          <uri>https://dx.doi.org/10.1007/s10661-023-11224-7</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135481">
        <element-citation publication-type="inproceedings">
          <person-group person-group-type="author">
            <name>
              <surname>Akhir</surname>
              <given-names>Nurul Syazna Mat</given-names>
            </name>
            <name>
              <surname>Salim</surname>
              <given-names>Pauziyah Mohammad</given-names>
            </name>
            <name>
              <surname>Yusoff</surname>
              <given-names>Zaharah Mohd</given-names>
            </name>
            <collab/>
          </person-group>
          <person-group person-group-type="editor"/>
          <article-title>Leveraging Google Earth Engine (GEE) for determining land use and land cover changes around Tasik Chini Malaysia</article-title>
          <source>9th International Conference on Geomatics and Geospatial Technology 2023 (GGT2023) </source>
          <year>2023</year>
          <volume>1240</volume>
          <series>IOP Conference Series: Earth and Environmental Science</series>
          <publisher-name>IOP Publ</publisher-name>
          <conf-loc>Kuala Lumpur, Malaysia</conf-loc>
          <conf-date>22/05/2023 - 25/05/2023 </conf-date>
          <fpage>012017</fpage>
          <uri>https://doi.org/10.1088/1755-1315/1240/1/012017</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135485">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Zhao</surname>
              <given-names>Zhewen</given-names>
            </name>
            <name>
              <surname>Islam</surname>
              <given-names>Fakhrul</given-names>
            </name>
            <name>
              <surname>Waseem</surname>
              <given-names>Liaqat Ali</given-names>
            </name>
            <name>
              <surname>Tariq</surname>
              <given-names>Aqil</given-names>
            </name>
            <name>
              <surname>Nawaz</surname>
              <given-names>Muhammad</given-names>
            </name>
            <name>
              <surname>Islam</surname>
              <given-names>Ijaz Ul</given-names>
            </name>
            <name>
              <surname>Bibi</surname>
              <given-names>Tehmina</given-names>
            </name>
            <name>
              <surname>Rehman</surname>
              <given-names>Nazir Ur</given-names>
            </name>
            <name>
              <surname>Ahmad</surname>
              <given-names>Waqar</given-names>
            </name>
            <name>
              <surname>Aslam</surname>
              <given-names>Rana Waqar</given-names>
            </name>
            <name>
              <surname>Raza</surname>
              <given-names>Danish</given-names>
            </name>
            <name>
              <surname>Hatamleh</surname>
              <given-names>Wesam Atef</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Comparison of Three Machine Learning Algorithms Using Google Earth Engine for Land Use Land Cover Classification</article-title>
          <source>Rangeland Ecology &amp; Management</source>
          <year>2024</year>
          <volume>92</volume>
          <fpage>129</fpage>
          <lpage>137</lpage>
          <issn>1550-7424</issn>
          <publisher-name>Elsevier BV</publisher-name>
          <uri>https://dx.doi.org/10.1016/j.rama.2023.10.007</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135484">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Kundu</surname>
              <given-names>Susanta</given-names>
            </name>
            <name>
              <surname>Kumar</surname>
              <given-names>Vinod</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Supervised Classification- An Overview of Machine Learning in Agricultural Practices</article-title>
          <source>Bulletin of Environment, Pharmacology and Life Sciences</source>
          <year>2022</year>
          <issue>5</issue>
          <fpage>398</fpage>
          <lpage>406</lpage>
          <uri>https://bepls.com/spl(5)2022/70.pdf</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135486">
        <element-citation publication-type="misc">
          <person-group person-group-type="author">
            <name>
              <surname>Donges</surname>
              <given-names>Niklas</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Random Forest: A Complete Guide for Machine Learning</article-title>
          <year>Mar. 08, 2024</year>
          <uri>https://builtin.com/data-science/random-forest-algorithm</uri>
        </element-citation>
      </ref>
      <ref id="R284032534135489">
        <element-citation publication-type="journal">
          <person-group person-group-type="author">
            <name>
              <surname>Aldiansyah</surname>
              <given-names>Septianto</given-names>
            </name>
            <name>
              <surname>Saputra</surname>
              <given-names>Randi Adrian</given-names>
            </name>
            <collab/>
          </person-group>
          <article-title>Comparison Of Machine Learning Algorithms For Land Use And Land Cover Analysis Using Google Earth Engine (Case Study: Wanggu Watershed)</article-title>
          <source>International Journal of Remote Sensing and Earth Sciences (IJReSES)</source>
          <year>2022</year>
          <volume>19</volume>
          <issue>2</issue>
          <issn>2549-516X, 0216-6739</issn>
          <publisher-name>National Research and Innovation Agency</publisher-name>
          <uri>https://dx.doi.org/10.30536/j.ijreses.2022.v19.a3803</uri>
        </element-citation>
      </ref>
    </ref-list>
  </back>
</article>
