Esri has been releasing more and more MOOCs over the past two years to accommodate its increasingly large expanse of products within the ArcGIS ecosystem.
But of all the MOOCs that I've participated in, 'Do-It-Yourself Geo App MOOC' must be the most underrated one produced by Esri Training. The functionalities highlighted within the MOOC took their cue straight from the recent Esri UC 2020, which went virtual. The curriculum includes:
The creation of hosted feature layers (without utilizing any GIS software medium like ArcMap or ArcGIS Pro).
The basics of the ArcGIS Online platform ecosystem:
hosted feature layer > web map > web app
Basically, to view a hosted feature layer, you will need to drag it onto a 'Map' and save it as a web map.
Conventionally, a web map suffices for the visualization and analytical work of any geospatialist familiar with Web GIS.
But this time, Esri is highlighting a brand-new web map product called 'Map Viewer Beta'. Why beta? Because it is still in beta, but so sleek and cool that they just had to let everyone have a shot at using it. Truth be told, Map Viewer Beta did not disappoint.
Even so, Map Viewer Beta still has some functionalities that have yet to be implemented.
Using web maps to visualize data, configure pop-ups, execute simple analyses and extend them to the Map Viewer Beta interface
Utilizing Survey123 for crowdsourcing data (the first level of citizen science) and creating a web map out of it.
Creating native apps using AppStudio for ArcGIS; no coding required.
Some tidbits on accessing the ArcGIS API for JavaScript
I love how this MOOC actually shows you, step by step, how to use the new Map Viewer Beta and explains the hierarchy of formats for published content in the ArcGIS Online platform.
I established my understanding of the ArcGIS Online ecosystem 3 years back, but I do find it awkward that such powerful information is not actually summarized in a way that is comprehensible for users who fully intend to delve into Web GIS. And Web GIS is the future, with all the parallel servers that can handle the processing and analysis of large amounts of data. ArcGIS Online is a simplified platform that provides friendly interfaces for fresh-eyed new geospatial professionals.
It is quite well known that Esri has drawn criticism for its dominance of GIS tools and resources within the geospatial science industry, but I believe it is something we can take with a pinch of salt. Not everything in Esri's massive line of commercial products is superior to other platforms, but it is a starting point for any new geospatialist who wants to explore technologies they are not familiar with.
All in all, this MOOC is heaven-sent. I have been playing with web apps and web maps for close to 4 years, and I can attest to the fact that it covers all the basics. As for the developer's bit, it may not go through things in a distinct step-by-step fashion, but it does stoke curiosity about how it all works. The question is, how do we make it work? Now that's a mystery I am eager to solve.
I'm going to put this on my ever-expanding to-do list and think JavaScript for another few months while testing out this ArcGIS API for JavaScript implementation. Tell me if you want to know how this actually works and I'll share what I find out when I do.
For those who missed out on this cohort, fear not. This MOOC runs twice a year, and the next cohort runs from Feb 17 to March 17, 2021. Registration is already open, so don't hold back and click the link below:
Do-It-Yourself Geo Apps
Do register for a public account before signing up, or just click 'Register' on the MOOC's page, which gives you the option to either sign in or 'Create a public account'. It was a blast and I'm sure, if you've never used any of the features I've mentioned above, you'll be as wide-eyed as I was 3 years ago. :D
Till then, stay spatially mappy comrades!
P/S: If you complete all the assignments and quizzes, you'll get a certificate of completion from Esri. Which is pretty rad!
There are a lot of Python courses out there that we can jump into and get started with. But at some point in that attempt to learn the language, the process becomes unbearably long and frustratingly slow. We all know the feeling of wanting to run before we can walk; we really wanna get started on some substantial project, but we do not know enough to even call the data into the terminal for viewing.
Back in August, freeCodeCamp, in collaboration with Jovian.ai, organized a very interesting 6-week MOOC called Data Analysis with Python: Zero to Pandas, and as a self-proclaimed Python groupie, I pledged my allegiance!
If there was any expectation that I managed to whizz through the course and obtain a certificate, nothing of that sort happened; I missed the deadline because I was busy testing out every single piece of code I found, and work had my brain on overdrive. I can't...I just...can't. Even with the extension, I was two 'Pythonic' answers short of earning the certificate. But don't mistake my blunders for the quality of the content this course has to offer; it is worth every gratitude of its graduates!
The Zero to Pandas MOOC spans 6 weeks, with one lecture webinar per week, compacting the basics of the Python modules relevant to executing data analysis. True to the play on its name, this course assumes no prior knowledge of Python and aims to teach prospective students the basics of the Python language structure AND the steps in analyzing real data. The course does not pretend that data analytics is easy, and it does not cut corners to simplify anything. It is a very 'honest' demonstration that effectively gives overly ambitious future data analysts a flick on the forehead about data analysis. Who are we kidding? Data analysis using a programming language requires sturdy knowledge of some nifty code to clean, splice and feature-engineer the raw data, and real critical thinking to figure out 'Pythonic' ways to answer analytical questions. What does 'Pythonic' even mean? Please refer to this article by Robert Clark, How to be Pythonic and Why You Should Care. We can discuss it somewhere down the line, when I am experienced enough to understand it better. But for now, Packt Hub has the most comprehensive simple answer: it is simply an adjective coined to describe a way/code/structure that takes good advantage of Python's idioms and displays natural fluency in the language.
The bottom line is, we want to be able to fully utilize Python in its context and using its idioms to analyze data.
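As a toy illustration of that idea (this example is entirely mine, not from the course), here is the same little task written the plodding way and the Pythonic way:

```python
# Non-Pythonic: index-based loop to collect the squares of the even numbers
nums = [1, 2, 3, 4, 5, 6]
squares = []
for i in range(len(nums)):
    if nums[i] % 2 == 0:
        squares.append(nums[i] ** 2)

# Pythonic: a list comprehension says the same thing in one readable line
squares_pythonic = [n ** 2 for n in nums if n % 2 == 0]

print(squares)           # [4, 16, 36]
print(squares_pythonic)  # [4, 16, 36]
```

Both produce the same result; the second reads almost like the sentence that describes it.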
The course is conducted on the Jovian.ai platform by its founder, Aakash, and it takes advantage of the Jupyter-like notebook format Binder, in addition to making synchronization available on Kaggle and Google's Colab. Each webinar spans close to 2 hours, and each week there are assignments on the lecture given. The assignments are due in a week, but given the very disproportionate ratio of students to instructors, there were some extensions of the submission dates that I was truly grateful for. A student forum is available at Jovian to engage students in discussing their ideas and questions, and the teaching body also conducts office hours where students can actively ask questions.
The instructor's method of teaching is something I believe to be effective for technical learners. In each lecture, he teaches the code and modules required to execute certain tasks within the thorough procedure of a data analysis task itself: from importing .csv-formatted data into Python to establishing navigation to the data repository, from explaining what the hell loops are to touching base with creating functions. All in the controlled context of the two most important modules for the real objective of this course: NumPy and pandas.
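To give a flavour of that first step, here is a minimal sketch of loading a .csv with pandas. The file contents and column names below are invented by me for illustration, not taken from the course:

```python
import io

import pandas as pd

# A tiny stand-in for a real .csv file on disk (hypothetical data);
# with a real file you would pass its path to pd.read_csv instead.
csv_text = """country,year,gdp_per_capita
Malaysia,2019,11414
Malaysia,2020,10412
Thailand,2020,7002
"""

# pd.read_csv accepts a path or any file-like object
df = pd.read_csv(io.StringIO(csv_text))

print(df.shape)                               # (3, 3)
print(round(df["gdp_per_capita"].mean(), 2))  # 9609.33
```

From there, the same DataFrame carries you through the rest of the analysis: filtering, grouping and plotting.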
My gain from this course is immensely vast, and that's why I truly think freeCodeCamp and Jovian.ai really put the 'tea' in 'teachers'. Taking advantage of the fact that people are involuntarily quarantined in their houses, this course is something that should not be placed aside in the 'LATER' basket. I managed to clear my head enough to understand what a 'loop' is! So I do think it can solve the world's problems!
In conclusion, this is the best course I have ever completed (90%!) on data analysis using Python. I look forward to attending it again and really finish up that last coursework.
Oh. Did I not mention why I got stuck? It was the last coursework. We were required to demonstrate all the steps of data analysis on data of our choice, create 5 questions and answer them using what we learned throughout the course. Easy, eh? Well, I've always had the tendency to dig my own grave every time I get awesomely cool assignments. But I'm not saying I did not do it :). Have a look-see at this notebook and consider the possibilities you can grasp after you've completed the course. And that's just my work...I'm a standard C-grade student.
And the exciting latest news from Jovian.ai is that they have an upcoming course for deep learning called Deep Learning with PyTorch: Zero to GANs! That's actually yesterday's news, since they organized it earlier this year...so yeah...this is an impending second cohort! Tentatively, the course will start on Nov 14th. Click the link below to sign up and get ready to attack the nitty-gritty. Don't say I didn't warn ya.
And that's me, reporting live from the confinement of the COVID pandemic, somewhere in a developing country in Southeast Asia....
Tool: ArcGIS Pro 2.9.3, Operations Dashboard for ArcGIS & ArcGIS Online Technique: Data transformation and geometric calculation
WWF-Malaysia Forest Cover Baseline is a dashboard of forest cover extent status in selected land uses across Malaysia's regions, the methodology of the analysis, and the resources involved in the exercise.
The WWF-Malaysia Forest Cover Baseline and Forest Cover Key Performance Index (KPI) is a task undertaken by the Conservation Geographical Information System (CGIS) Unit to amass the discrete information of forest cover extent across Malaysia's 3 main regions of legislation: Peninsular Malaysia, Sarawak and Sabah. This exercise produces a concise dashboard report on an online platform that describes the processed information on forest cover status as well as the prospective areas identified for conservation work.
The report can be interactively accessed at the dashboard Malaysia Forest Cover 2020.
📌 Availability: Retracted in 2021
Here’s a quick rundown of what you’re supposed to do to prepare yourself to use Python for data analysis.
Install Python ☑
Install Miniconda ☑
Install the basic Python libraries ☑
Create new environment for your workspace
Install geospatial Python libraries
Let’s cut to the chase. It’s December 14th, 2021. Python 3 is currently at version 3.10.1. It’s a great milestone for Python 3, but there was hearsay of issues concerning 3.10 when it comes to using it with conda. Since we’re using conda for our Python library and environment management, we’ll stay safe by installing Python 3.9.5.
Download 👉🏻 Python 3.10.1 if you want to try your hand at some adventurous troubleshooting
Or download 👉🏻 Python 3.9.5 for something quite fuss-free
📌 During installation, don’t forget to ✔ the option Add Python 3.x to PATH. This enables you to access your Python from the command prompt.
As a beginner, you’ll be informed that Anaconda is the easiest Python library manager GUI implementing conda, and that upon installation it contains all the core and scientific libraries you’ll ever need for your data analysis. So far, I believe it’s unnecessarily heavy, the GUI isn’t too friendly, and I don’t use most of the pre-installed libraries. So after a few years in the dark about it, I resorted to jumping ship and using the skimmed-down version of conda: Miniconda.
Yes, it does come with the warning that you should have some sort of experience with Python to know what core libraries you need. And that’s the beauty of it. We’ll get to installing those libraries in the next section.
◾ If you’re skeptical about installing libraries from scratch, you can download 👉🏻 Anaconda Individual Edition directly and install it without issues; it takes some time to download due to the big file and a tad bit longer to install.
◾ Download 👉🏻 Miniconda if you’re up to the challenge.
📌 After you’ve installed Miniconda, you will find it under the Anaconda folder in your Windows Start menu. By this time, you will already have Python 3 and conda ready on your computer. Next we’ll jump into installing the basic Python libraries necessary for core data analysis and create an environment to house the geospatial libraries.
The core libraries for data analysis in Python are the following:
🔺 numpy: a Python library that enables scientific computing by handling multidimensional array objects (including masked arrays and matrices) and all the mathematical operations involved.
🔺 pandas: enables the handling of 'relational' or 'labeled' data structures in a flexible and intuitive manner. Basically, it lets you work with data in a tabular structure similar to what we see in Excel.
🔺 matplotlib: a robust library that helps with the visualization of data; static, animated or interactive. It’s a fun library to explore.
🔺 seaborn: another visualization library, built on top of matplotlib, which is more high-level and produces more crowd-appealing visualizations. Subject to preference, though.
🔺 jupyterlab: a web-based user interface for Project Jupyter where you can work with documents, text editors, terminals and Jupyter Notebooks. We are installing this library to tap into the notebook package that comes with it.
To start installing:
1️⃣ At Start, access the Anaconda folder > Select Anaconda Prompt (miniconda3)
2️⃣ An Anaconda Prompt window similar to the Windows command prompt will open > Navigate to the folder where you would like to keep your analytics workspace using the following common command prompt commands:
◽ To backtrack folder location 👇🏻
◽ Change the current drive, to x drive 👇🏻
◽ Navigate to certain folders of interest, e.g. going deeper from the Lea folder, i.e. Lea\folder_x\folder_y 👇🏻
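The original post showed these commands as screenshots; here is my reconstruction of the three navigation commands above for a Windows Anaconda Prompt (folder names are placeholders):

```shell
:: Backtrack one folder level
cd ..

:: Change the current drive to the X: drive
X:

:: Navigate deeper, e.g. from the Lea folder into Lea\folder_x\folder_y
cd Lea\folder_x\folder_y
```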
3️⃣ Once navigated to the folder of choice, you can start installing all of the libraries in a single command as follows:
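The post doesn't reproduce the command itself, so here is a plausible version of that single install command (package names match the list above; if conda can't find jupyterlab on the default channel, add `-c conda-forge`):

```shell
conda install numpy pandas matplotlib seaborn jupyterlab
```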
The command above enables the simultaneous installation of all the essential Python libraries needed by any data scientist.
💀 Should there be any issues during the installation, such as an uncharacteristically long installation time (1 hour is stretching it), press Ctrl + C to cancel any pending processes and retry by installing the libraries one by one, i.e.
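For instance, one library at a time (my reconstruction; the original commands were not included in the post):

```shell
conda install numpy
conda install pandas
conda install matplotlib
conda install seaborn
conda install jupyterlab
```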
Once you manage to get through the installation of the basic Python libraries above, you are halfway there! With these packages, you are already set to do some pretty serious data analysis. The numpy, pandas and matplotlib libraries are the triple threat for exploratory data analysis (EDA), and the jupyterlab library provides the documentation-cum-coding notebook that is shareable and editable among teammates or colleagues.
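For a quick taste of that triple threat working together (toy numbers I made up, with the matplotlib step left as a comment so the snippet runs anywhere):

```python
import numpy as np
import pandas as pd

# Toy dataset: monthly rainfall readings (hypothetical values)
rain = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "rainfall_mm": [210.0, 180.5, 95.2, 160.8],
})

# numpy powers the math under the hood of pandas
print(float(np.mean(rain["rainfall_mm"])))  # 161.625

# pandas answers quick analytical questions
print(rain.loc[rain["rainfall_mm"].idxmax(), "month"])  # Jan

# matplotlib would visualize it, e.g.:
# rain.plot(x="month", y="rainfall_mm", kind="bar")
```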
Since we’re the folks who like to make ourselves miserable with the spatial details of our data, we will climb over another two hurdles: creating a geospatial workspace using conda and installing the libraries needed for geospatial EDA.
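As a rough preview of those two hurdles (the environment name and library choice here are mine, and the details belong to part 2), the conda commands look something like:

```shell
conda create -n geo python=3.9
conda activate geo
conda install -c conda-forge geopandas
```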
If you're having issues following the steps here, check out the real-time demonstration of the installations at this link 👇🏻
See you guys in part 2 soon!
Tool: ArcGIS Pro 2.6.3 Technique: Symbolization, labeling and SQL expression
MBR 2023 is a peak event that culminates all the effort of data collection and stock-taking of hydrocarbon resources in Malaysia. It is an annual event that puts together all the exploration blocks, discoverable hydrocarbon fields and late-life assets for the upstream sector to evaluate and invest in.
Leading up to the event, Malaysia Petroleum Management (MPM) updates, re-evaluates and produces maps, static and digital, to cater to the need for the most up-to-date stock-take of information that can be gained from various sources: exploration output (seismic, full tensor gradiometry), assets (cables, pipelines, platforms), as well as discoverable resources. This year's theme aims to include various prospects and initiatives to align the industry with lower carbon emissions and to explore options for carbon capture and storage (CCS) in popular basins such as the Malay and Penyu Basins. This is a big follow-up to the closing of MBR 2022, with the PSC signing for 9 blocks a few days earlier.
Credit: Sh Shahira Wafa Syed Khairulmunir Wafa
Over 70 maps for unique blocks were produced during the finalization stage, ~210 maps during data evaluation, and an additional 20 for the event itself. And this excludes the standardized maps that formalize information requested by prospective bidders as well as clients facing the prospect of extending their contracts.
Standardizing the maps required optimizing the workflow and building standard templates to cater to rapid changes and rapid export of output.
For more information on the event, please access the following resources:
PETRONAS: Malaysia Bid Round
PETRONAS myPROdata
The Malaysian Reserve: Petronas offers 10 exploration blocks in MBR 2023
Peta Gunatanah Malaysia 2014-2018 ("Malaysia's Land Cover 2014-2018") web application is a platform generated for the Quality Assessment activity organized by the Forest Research Institute Malaysia (FRIM) on 23rd June 2024.
The workshop aims to collect field/reference data from Malaysia's state agencies in an effort to verify the quality of the land cover classification output generated in support of measuring CO2 release from converted agricultural lands.
Participants are able to access the app via conventional browsers on their mobile devices and submit the drawings/sketches they have captured within interactive data layers.
This web app aims to support direct input from the source for the task of improving the accuracy of the generated land cover maps. Vectors generated from this exercise are readily standardized to the data schema required by the quality assessment, making full use of the ArcGIS Online ecosystem to produce concrete output and actionable information.
Coding is one of the things I have aspired to do since like...forever! But finding a resource in-sync with my comprehension, schedule and able to retain my interest long enough is a challenge.
I have the attention span of a gnat, so I jumped everywhere! If I am not actively engaged with the learning, I just can't do it. And I know...we have DataCamp, Udemy, Khan Academy and even Kaggle...but I either can't keep up, am too poor to pay for the full course, or it couldn't sync with me enough. I believe I can say that most of the exercises don't 'vibe' with me.
Recently, I committed myself to one passion of mine: running. It was one of my favorite activities back in school, but the will to really run died a decade ago. I have recently picked up my running shoes and run my little heart out despite having the speed of a running ant; aging, perhaps? And I owe my hardcore will to the motivation of earning back what I paid when I decided to join a month-long, 65 km virtual run called 'Pave Your Path', organized by Running Station. Nailed it 2 days ago after 13 sessions of 5 km; yes, you can accumulate the distance from multiple runs. It made me realize that...it's not that bad. The 'near-death' experience while running has kinda turned me into a daredevil these days when it comes to undertaking things I'd have whined about doing a few months back.
"If I can go through dying every single evening for 5km long run...I can handle this,"
My thoughts exactly every time I feel so reluctant to finish some tasks I believe I could hold off for some time.
Naturally, I plan my work rigorously, and despite the flexibility of my schedule and my detailed plans, I still have a hard time hammering the last nail into my projects. Usually, it's due to my brain's exhaustion from overthinking, or I am just truly tired physically. Which is a weird situation given that I do not farm for a living. Even so, I was lethargic all the time.
But when I started running a month ago, things kind of fell into place for me. Maybe...just maybe...I've become more alert than I used to be. I still keep my ignorance of things that I believe do not need my immediate attention, but I seem to be able to network my thoughts faster than I used to.
It might be just me, feeling like a new person due to my sheer willpower to not burn my RM60 paid for the virtual run, but it did feel like there was a change.
For that, I managed to confirm what I have suspected all along: I am one of those people who love drills. I like things to be drilled into my head until I know them by heart, and then I focus on polishing the effectiveness.
Thus...for coding, I committed myself to freeCodeCamp. By hook or by crook, I'll be coding by the first quarter of next year or someone's head is gonna roll!
It's an interactive learning experience simple enough for me to start, straightforward enough to not make me waste my time searching for answers and it's free. God bless Quincy Larson.
Going back to the program outlined in freeCodeCamp, I find it fascinating that they start off with HTML. I have no arguments there. My impatience made me learn my lesson: if you run too fast, you're going to burn out painfully and drop dead before you're halfway through. HTML is a very gentle introduction to coding for newbies, since it's like LEGO building blocks, where you arrange and match blocks to create something. I didn't have to go crazy with frustration if I didn't 'get' it. Yes, we would all want some Python lovin', and a lot of coders I have come to know rave about how simple it is to learn. But I think that is an opinion shared by 'experienced' coders who wish Python had been there when they first started coding. Someone once told me that what you think is the best based on others' experiences may not be the best for you...and I agree with this. After a lot of deliberation and patience at my end, starting over again this time feels unlike the dreaded looming doom I've always had back then.
Are you into coding? What do you code and what's your language preference? Where did you learn coding? Feel free to share with me!
Yes peeps. I’ve been studying and, contrary to all my previous attempts to make beautiful notes, I said f it and just worked with what helps me clear my head the fastest 🏃🏻♀️. I love writing notes, but I realize that, to gather my thoughts properly, I need some way to not waste paper just arranging and rearranging my ideas or comprehension of things.
What better way of doing that than using a mind map!
So you kiddos out there who are starting out with Python and just can’t wait to get into deep learning or machine learning, I’d say: hold your horses for a minute and have a preview of that pond you’re trying to jump into. And don’t be scared, cause we’re all friends here in the hell-hole of the learning plateau. Will it get better? I believe so. I am positive I understand more of the principles of deep learning and the relevance of the Python libraries associated with it. Yes...this is a Python bar, darling. 👩🏻💻
There’s no real shortcut, if you ask me, since we all have different ways of comprehending things; my pre-existing mold may have a harder time grasping the things I am learning right now than yours would. So don’t be afraid to doodle while you think. No amount of paper will be enough to help you understand things, so better to start being sustainable by using some digital platform, and save those papers for when you’re truly ready to pen out your understanding of things; not what you read. There’s a difference!
Check out the mind map of some essential Python libraries you can get started with before you start doing some deep learning. It’s worth reviewing all that prior, I promise.
Have fun! 🙆🏻♀️
Hunting for spatial data comes naturally now. There seems to be less and less room for doubt when we can attach a pair of coordinates to a place.
For work and hobby, hunting for data takes almost half of the usable hours I set aside to execute certain objectives, if not 100%. Although the internet is a vast plain of data, not all of it is usable. The democratization of data is a subject too translucent to discuss but too solid to argue with. Thus, with differing opinions, we get different versions of it online. Here are some of the interesting data platforms I managed to scour, grouped by thematic subject:
🌳 Nature and Environment
Delta at Risk - Profiling Risk and Sustainability of Coastal Deltas of the World. I found this while lamenting how people love asking for data additions to their maps at the eleventh hour. I find their confidence in my skills quite misleading but flattering nonetheless. But it does not make it any less troublesome.
Protected Planet - Discover the world's protected and conserved areas. This platform includes not just data on protected areas, but also other effective area-based conservation measures like ICCAs and IUCN listings and, as the website claims, it is updated regularly via submissions from agencies. So far, I have found this platform to be the most convenient, since it rounds up all possible conservation-based themes, including World Heritage Sites.
Global Forest Change (2000-2020) - The global forest extent change from 2000 to the current year, lovingly referred to as the Hansen data by most forestry RS specialists. This data is updated annually and, to be honest, the platforms hosting it are literally everywhere. But this one is legitimate, under Earth Engine Apps, and you can refer to Google Earth Engine for future data updates to ease your search.
👩⚖️ Administrative Data
GADM - Map and spatial data for all countries and their sub-divisions.
🏦 Built-environment Data
OpenStreetMap - This database is the most amazing feat of tech-aware crowdsourcing. A little more than 2 decades ago, some 'experienced' gate-keeping professionals would have refuted its legitimacy within an inch of their lives, but OSM has proven that time prevails when it comes to bringing accessibility and network data into practical use. I am not that adept at downloading from this website, so I go for a more manual data download. My favorite is Geofabrik Download, but you can also try Planet OSM.
🎮 Other Cool Data
OpenCell ID - Open database platform of global cell towers. Cleaning the data is a nightmare but I think it is just me. I have little patience for cerebral stuff.
So, those are some of the data I managed to dig for personal projects. Hope it helps you guys too!
🟢 Beginner-friendly.
🆓 Free with no hidden monetary cost.
🤚🏻 Requires registration, so sign up 👉🏻 https://signup.earthengine.google.com/; access is via a browser and an Internet connection.
🖥️ Available for Windows, Mac and Linux.
Google Earth Engine, lovingly called GEE, is another free and open platform provided by Google, offering a very vast and comprehensive collection of earth observation data. Since Sentinel-2 is no longer available for download at USGS Earth Explorer, and I find the alternatives too challenging, GEE seems like the easiest way to go. If you're looking for a one-stop platform to access satellite imagery for free, GEE is a great place to start. You don't have to learn JavaScript explicitly to start using this tool.