The ‘Eyes on the Forest’ Sarawak web application was developed through the joint effort of WWF-Malaysia and WWF-Japan. Emulating the successful ‘Eyes on the Forest’ Sumatra database, this web app aims to provide undiluted information on the rich diversity of Sarawak’s unique wildlife, forests and land uses, and to increase transparency on the threats (deforestation, infrastructure development and urbanization) as well as their drivers (concessions, agricultural activities and unsupervised resource harvesting).
Identifying where the threats, the drivers and the conservation targets are located will help make a concrete point at the policy level and push for more boots on the ground to ensure that protected areas stay protected. The baseline information provided is essential for prompt intervention in biodiversity protection.
Data are collected and consolidated from dispersed public sources and engineered for simple, direct visualization. Values are generated based on verified, legislated information provided by stakeholders and state agencies to the Sarawak Conservation Programme (SCP) of WWF-Malaysia.
Developed and maintained by:
Azalea Kamellia
GIS Officer (SCP) | WWF-Malaysia
June 2018 - Present
Last year, I once again participated in the 30 Day Map Challenge that goes around Twitter-ville every November. It was my third attempt at the marathon, and 2022 served as a reminder that I have progressed, even though I got stuck at Day 3 when life caught up with me.
I don't like the idea that I have left the challenge incomplete, again. It was not my priority, and I work better with clear goals or a vision of the expected output. If something does not feed my need to learn something new, it is a task bound to head straight to the back burner. Let's resolve to make it a long-term routine instead of a spurt of stress trying to make the deadline.
As a consequence, I am turning this task into one that actually benefits me: putting on record, in writing, the techniques and tools I used to make the maps. I believe that will serve more purpose and add more value than the visuals alone. And perhaps I will have some stock ready for submission this year instead.
Did anyone else participate in this challenge back in November? How did you do, and what would you like to do better next time? Don't be shy and do drop a word or two.
There are a lot of Python courses out there that we can jump into and get started with. But at a certain point in the attempt to learn the language, the process becomes unbearably long and frustratingly slow. We all know the feeling of wanting to run before we can walk; we really wanna get started on some substantial project, but we do not know enough to even call the data into the terminal for viewing.
Back in August, freeCodeCamp, in collaboration with Jovian.ai, organized a very interesting 6-week MOOC called Data Analysis with Python: Zero to Pandas, and as a self-proclaimed Python groupie, I pledged my allegiance!
If there was any expectation that I managed to whizz through the course and obtain a certificate, nothing of the sort happened; I missed the deadline because I was busy testing out every single piece of code I found, and work had my brain on overdrive. I can't...I just...can't. Even with the extension, I was short of the 2 Pythonic answers required to earn the certificate. But don't mistake my blunders for the quality of the content this course has to offer; it is worth every bit of gratitude from its graduates!
Zero to Pandas is a MOOC that spans 6 weeks, with one lecture webinar per week, and compacts the basics of the Python modules that are relevant to executing data analysis. True to the play on its name, this course assumes no prior knowledge of the Python language and aims to teach prospective students the basics of Python language structure AND the steps in analyzing real data. The course does not pretend that data analytics is easy, and it does not cut corners to simplify anything. It is a very 'honest' demonstration that effectively gives overly ambitious future data analysts a flick on the forehead about data analysis. Who are we kidding? Data analysis using a programming language requires sturdy knowledge of some nifty code to clean, splice and feature-engineer the raw data, and real critical thinking to figure out 'Pythonic' ways to answer analytical questions. What does 'Pythonic' even mean? Please refer to this article by Robert Clark, How to be Pythonic and Why You Should Care. We can discuss it somewhere down the line, when I am experienced enough to understand it better. For now, Packt Hub has the most comprehensible simple answer: it is an adjective coined to describe a way/code/structure that takes good advantage of Python's idioms and displays natural fluency in the language.
The bottom line is, we want to be able to fully utilize Python in its context and using its idioms to analyze data.
The course is conducted on the Jovian.ai platform by its founder, Aakash, and it takes advantage of a Jupyter-like notebook format, Binder, in addition to making synchronization available on Kaggle and Google Colab. Each webinar spans close to 2 hours, and each week there are assignments on the lecture given. The assignments are due in a week, but given the very disproportionate ratio of students to instructors, there were some extensions to the submission dates that I was truly grateful for. A student forum is available at Jovian to engage students in discussing their ideas and questions, and the teaching body also holds office hours where students can actively ask questions.
The instructor's method of teaching is something I believe to be effective for technical learners. In each lecture, he teaches the code and modules required to execute certain tasks within the thorough procedure of a data analysis task itself: from importing .csv-formatted data into Python to navigating the data repository, from explaining what the hell loops are to touching base on creating functions. All in the controlled context of the two most important modules for the real objective of this course: NumPy and Pandas.
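Just to give a flavour of the kind of workflow the lectures build up to, here's a minimal sketch in that spirit (the file name and columns are made up for illustration, not taken from the course materials):

```python
import pandas as pd

# Load a CSV file into a DataFrame (hypothetical file and columns).
df = pd.read_csv("countries.csv")

# First-look inspection of the data.
print(df.shape)    # rows x columns
print(df.dtypes)   # data type of each column
print(df.head())   # first few rows

# A simple, Pandas-flavoured answer to an analytical question,
# e.g. average population by region.
avg_population = (
    df.groupby("region")["population"]
      .mean()
      .sort_values(ascending=False)
)
print(avg_population)
```

The real assignments go much further than this, but that load-inspect-summarize rhythm is roughly the backbone the course keeps coming back to.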
I gained an immense amount from this course, and that's why I truly think that freeCodeCamp and Jovian.ai really put the 'tea' in 'teachers'. Taking advantage of the fact that people are involuntarily quarantined in their houses, this course is something that should not be placed aside in the 'LATER' basket. I managed to clear my head enough to understand what a 'loop' is! So I do think it can solve the world's problems!
In conclusion, this is the best course I have ever completed (90%!) on data analysis using Python. I look forward to attending it again and really finish up that last coursework.
Oh. Did I not mention why I got stuck? It was the last coursework. We were required to demonstrate all the steps of data analysis on data of our choice, create 5 questions and answer them using what we had learned throughout the course. Easy, eh? Well, I've always had the tendency of digging my own grave every time I get awesome, cool assignments. But I'm not saying I did not do it :). Have a look-see at this notebook and consider the possibilities you can grasp after you've completed the course. And that's just my work...I'm a standard C-grade student.
And the exciting latest news from Jovian.ai is that they have an upcoming course at Jovian on deep learning called Deep Learning with PyTorch: Zero to GANs! That's actually yesterday's news since they organized it earlier this year...so yeah...this is an impending second cohort! Tentatively, the course will start on Nov 14th. Click the link below to sign up and get ready to attack the nitty-gritty. Don't say I didn't warn ya.
And that's me, reporting live from the confinement of the COVID pandemic, somewhere in a developing country in Southeast Asia....
I'm hitting the backed-up reading list that I've accumulated in my Zotero. It's annoying, and you procrastinate the task of reading as much as possible when you're in that potato phase. I am demotivated, bored, constantly tired, and feel like devoting myself to reading storybooks for life. If I could get paid for all the hours I sleep every time I feel like signing out from life, I would be making a decent living. But, too bad, I don't.
I do not endorse any products or review anything, since I feel like, to each their own. So, I'm not going to tell you what works best or how some tips can magically fix your life. I am lucky that I have an incredible academic supervisor, a flexible boss at work, a very academically oriented sibling, and a supportive squad of friends. Even with all that, I am still depressed. So, if you're down on the low at the moment, you're not alone. But when you have made a promise, you will look like a total flake if you don't deliver. So, you gotta move your ass anyway, right?
I just started reading papers again and it was so hard. Two weeks went by without me making any progress...just stuck on one paper and not retaining a single piece of information at all. All that forehead and nothing...nothing sticks. So you can say that I am hating life right now. But, today...I managed to reach some sort of compromise with myself, and it is starting to feel good. So, I would like to share it with those of you who could be struggling to get the engine started as well.
🎯 Literature Review Catalog
My supervisor is an awesome human being. He's the manager/cheerleader/mentor/Allfather/Captain America/Britney Spears to my lackluster academic history. He has been keeping tabs on me despite my intermittent anxious moods that swing like a freaking metronome, so you can say that he practically keeps my boat afloat in this unprecedented time. For our proposal writing (there's a whole army of us that he's supervising), he shared something valuable: the 'Literature Review Catalog'.
Yes. It's an Excel sheet. Nothing fancy, just very normal columns that describe the papers/resources you've read. It looks simple and useful. The columns are populated as follows (see the sketch after the list):
Year: The year of publication.
Author: Short author list.
Country (Study Area): The areas being studied in this research. If you're an Earth Science student like me, you can narrow it down to countries. But I think, overall, countries are the most general way of discriminating between different studies.
Main Keyword: I create my own keywords to develop my own system of comprehension. But I do create a column for the keywords found in the paper itself.
Issue & Objectives: You can find this information from the Abstract and Introduction part of the paper.
Proposed Method: This can be found in the Results section, but I usually scan through the Methodology to add more information when I do a second pass over the paper.
Findings & Conclusions: In addition to the conclusion, I add notes here on information that is new to me. New information can be extracted when you do another once-over of the paper, and the conclusion can be obtained from the Conclusion section.
Reference: You can find references that are relevant to your studies from this paper! So why not? Right?
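If you prefer to spin up the same catalog programmatically rather than in Excel, here's a throwaway sketch of those columns as a Pandas DataFrame (the example row is a made-up paper, purely to show how an entry is filled in):

```python
import pandas as pd

# The same columns as the Excel-based Literature Review Catalog.
columns = [
    "Year", "Author", "Country (Study Area)", "Main Keyword",
    "Issue & Objectives", "Proposed Method",
    "Findings & Conclusions", "Reference",
]
catalog = pd.DataFrame(columns=columns)

# A fictional entry, just to illustrate the shape of a row.
catalog.loc[len(catalog)] = [
    2020, "Doe et al.", "Malaysia", "land-use change",
    "Quantify forest loss in a protected landscape",
    "Supervised classification of satellite imagery",
    "Forest cover declined over the study period",
    "Relevant citations pulled from the paper's reference list",
]

# Save it out so it can live alongside (or replace) the Excel sheet.
catalog.to_excel("literature_review_catalog.xlsx", index=False)
```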
But, it's the laborious work that comes with it that turns my stomach. It scares the hell out of me despite any motivational speech I give myself. But it can all make sense when you pair it with the following method 👇🏻👇🏻👇🏻
🎯 How To Read A Paper Quickly & Effectively | Easy Research Reading Technique
This is the gem my sister told me about yesterday. I brushed it off at first, since it stresses me out to see people sharing their speed-reading techniques, study tips, and how to ace all the subjects in the world or get a 4.0 GPA. It really isn't the good people's fault, and I blame it on my constantly anxious self. I don't even know what's wrong with me, so...it's not them. It's me. But, here, we're gonna work on 'me'. So, give this 10-minute video a watch. It's worth it, because Dr. Amina Yonis really knows what she's talking about, and what's even better, she really is an advocate for effective reading/studying. It's short enough for you to maintain your attention span, and you will learn how to actually 'evaluate' your reading materials: are they worth a second read? Is there any added value in them?
To summarize, what you should look out for:
Title: Read the title and find the keywords
Abstract: Look out for the results and methods in a simple sentence
Introduction: Read the first and last paragraphs. Most of the time, the first paragraph highlights the satellite view of the crisis and the last paragraph zooms straight for the objective.
Results: Pay attention to the headings, since they more or less highlight what it was that the authors found. If there aren't any headings, try going through the section paragraph by paragraph. Scan them through.
Conclusion: This summarizes everything in the research paper.
After the 'Conclusion', you may feel that the findings are something you already expected or grasped, and you may just proceed to read other new papers in your pile. But if you need to dive deeper, jump back to the 'Results' for the key figures, results and limitations.
So ...
How do you go about reading this, and what has it got to do with the 'Literature Review Catalog'? Well, using this efficient reading method and taking the notes down into the columns will help you condense all the important information, and it helps you stop constantly re-reading details that are not paramount to your study.
🎯 Forest App
To amp things up and see if it was effective, I actually timed myself with the 'Forest App'. I had been estranged from it since my potato phase, but now it's back to being that BFF I need. It took 10 minutes to go through all the steps and, if the paper isn't heavy-laden, 5 minutes to fill it into the 'Literature Review Catalog'. I managed to think and ask questions in my head as I filled in the columns, and I believe that's the most important part of the effective reading we need as someone jumping into the very dynamic environment of scrutinizing existing work. You can use any sort of timer to give a sense of urgency to your work - it does help to a certain extent. So, if you intend to have fun growing a forest of pretty trees while making good use of your focus time, check out this video!
🎯 Reference Manager
And please, please, please, organize/record your references responsibly using reference management software. Some swear by Mendeley, or the good ol' EndNote. There are also Flowcite and Citationsy. Use them. Don't download those papers indiscriminately without recording the details that can help you sync them straight to your word processor using viable plugins. I personally use Zotero. It comes with a Chrome plugin and a Microsoft Word plugin that you can download separately, and it's compatible with Linux and iOS operating systems. I used to park my work at Mendeley, but I find Zotero more powerful and flexible enough to use, and it actually makes me put in the effort to remember what I downloaded rather than rely on the convenience of going back and forth to cloud storage. And it's open-source. So, try it out to create an organized library.
To all the aspiring scholars out there, when you win, we all win. Share your phase and troubles with the #studyblr or here with me. Emotional support is important and if the internet does not give you peace of mind, sign out and unplug. It's ok. When you're ready to work, reach out to anyone you think will respond positively and want to help you succeed. We can't all do things alone. So, start that power-up playlist and start working!
With this, I am commencing my submission for the #30DayMapChallenge for 2023 🗺
The categories outlined are similar to last year's, but I am never going to hate this repetition. How can I? They are the basics of making maps, and there's so much to learn from each single-word theme.
Any aspiring map-makers out there? Let's share our maps for this wonderful month of November under the #30DayMapChallenge 2023!
There are moments when base maps just couldn't or wouldn't cut it. And DEMs are not helping either. The beautiful hillshade raster generated by the hillshade tool can't help if the DEM isn't as crisp as you would want it to be. And to think that I've been retreating into hermitage to learn how to 'soften' and cook up visual 'occlusion' to make maps look seamlessly smooth. Cartographers are the MUAs of the satellite image community.
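For the curious, here's a minimal sketch of one way to 'soften' a hillshade, assuming the Spatial Analyst extension in ArcGIS Pro and hypothetical dataset names; this is not necessarily the exact recipe behind my maps, just the general idea of smoothing the DEM before shading it:

```python
import arcpy
from arcpy.sa import FocalStatistics, Hillshade, NbrCircle

# Hypothetical workspace and DEM name; requires the Spatial Analyst extension.
arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\terrain.gdb"
arcpy.env.overwriteOutput = True

# Smooth the DEM first so the hillshade doesn't pick up every bit of noise.
smoothed_dem = FocalStatistics("dem", NbrCircle(3, "CELL"), "MEAN")

# Then generate the hillshade from the smoothed surface (default sun angle).
soft_hillshade = Hillshade(smoothed_dem, 315, 45)
soft_hillshade.save("hillshade_soft")

arcpy.CheckInExtension("Spatial")
```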
I have always loved monochromatic maps, where the visual is clean and the colors are not harsh and easy for me to read. There haven't been many gigs at work lately where map-making is concerned; the last one was back in April for some of our new strategy plans. So, when my pal wanted me to just 'edit' some maps she wanted to use, I couldn't stop myself at just changing the base map.
The result isn't as much as I'd like it to be, but then, we are catering to the population that actually uses this map. It was inspired by the beautiful map produced by John M Nelson that he graciously presented at NACIS 2019: An Absurdly Tall Hiking Map of the Appalachian Trail. What I find absurd is how few views that presentation has. The simplicity of the map is, for me, spot-on. Like Daniel P. Huffman, as he confessed in his NACIS 2018 talk Mapping in Monochrome, I am in favor of monochromatic color schemes. I absolutely loathe a chaotic map that looks like my niece's unicorn barfed 70s color deco all across the screen. Maybe it is deemed justifiable for the practical purpose of differentiating values of an attribute, but surely we can do better than clashing orange, purple and green together, no?
So...a request to change some labels turned into a full-on makeover. There are some things I realized while making this map using ArcGIS Pro that I believe any ArcGIS Pro noob should know:
Sizing your symbols in Symbology should ideally be done in the Layout view. Trust me. It'll save you a lot of time.
When making outlines of anything at all, consider using a tone or two lighter than the darkest of colors and make the line thinner than 1 pt.
Halos do matter for your labels or any textual elements of your map.
Sometimes, making borders for your map is a goose chase that's hard to justify. You don't particularly need them, especially if the map is going to be packed together with articles or be part of a book, etc.
Using blue all the way might have been my preference, but they have different zonations for the rivers, so that plan went out the window.
And speaking of windows...the window for improvement in this map is as big as the US and Europe combined.
In ArcGIS Pro, the Erase tool only comes with the Advanced license. There are other ways to go about removing parts of a polygon/line layer, such as the Clip tool, but Union is the tool that makes more sense in principle.
It works by marking the overlapping parts of two different layers with integers: 1, 2 and so forth, while those that do not overlap are universally -1. So, remove everything you want out of the picture by deleting the output features whose FID values are greater than -1! Simple, eh?
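Here's a rough arcpy sketch of that exact workflow, with hypothetical layer names (the FID field in the Union output is named after whichever layer you are 'erasing' with):

```python
import arcpy

# Hypothetical geodatabase and layer names, purely for illustration.
arcpy.env.workspace = r"C:\data\project.gdb"
arcpy.env.overwriteOutput = True

# Union the layer to keep ("parcels") with the layer acting as the eraser.
arcpy.analysis.Union(["parcels", "exclusion_zones"], "parcels_union")

# Features that do NOT overlap the eraser carry FID_exclusion_zones = -1,
# so select everything above -1 and delete it.
arcpy.management.MakeFeatureLayer("parcels_union", "union_lyr")
arcpy.management.SelectLayerByAttribute("union_lyr", "NEW_SELECTION", "FID_exclusion_zones > -1")
arcpy.management.DeleteFeatures("union_lyr")
```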
Check out the under-3-minute demo below!
P/S: Happy New Year peeps! ♥
Tool: ArcGIS Pro 2.9.3
Technique: Overlay analysis, visualization via remote sensing
These maps are developed to aid or supplement the Natural Capital Valuation (NatCap) initiative. As cited by WWF:
An essential element of the Natural Capital Project is developing tools that help decision makers protect biodiversity and ecosystem services.
One of the sites included in this initiative by WWF-Malaysia is the Heart of Borneo (HoB). Specifically for this exercise, the visualizations of policy and land use eventually become the data inputs for the InVEST tool, which generates the models and maps of the economic values of ecosystem services within the landscape of interest.
The generation of the data mainly involves light-touch remote sensing to assess the status of land use in the respective concessions, using Sentinel-2 satellite imagery with specific band combinations to identify tree cover, particularly mangrove forest.
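The exact band combination isn't spelled out here, so as a generic sketch of the kind of step involved, here is NDVI computed from Sentinel-2 red and near-infrared bands with rasterio; the file names are hypothetical, and mangrove-oriented composites typically bring in a SWIR band on top of this:

```python
import rasterio

def read_band(path):
    """Read a single-band Sentinel-2 GeoTIFF as float32."""
    with rasterio.open(path) as src:
        return src.read(1).astype("float32"), src.profile

# Hypothetical band exports: B04 = red, B08 = near-infrared.
red, profile = read_band("T49NHC_B04.tif")
nir, _ = read_band("T49NHC_B08.tif")

# NDVI is a simple first cut at separating tree cover from cleared land.
ndvi = (nir - red) / (nir + red + 1e-6)

# Write the index back out with the same georeferencing as the inputs.
profile.update(dtype="float32", count=1)
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```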
Tool: Operations Dashboard for ArcGIS, Survey123 for ArcGIS, ArcGIS Online
Technique: XLSForm programming, web application development
The northern highland communities of the Lun Bawang have been collaborating with WWF-Malaysia under the Sarawak Conservation Programme (SCP) to empower sustainable economies and manage their natural biodiversity through the Community Empowerment Strategy (formerly known as the Community Engagement and Education Strategy).
Since 2016, the communities have been actively mapping out their land uses and culturally important locations to delineate their areas of settlement and sources of livelihood. Given the close proximity of their communities to licensed timber concessions, producing a definitive map is important to preserve and conserve their surrounding natural capital.
Several outreach sessions have been conducted, and the community mapping effort has shifted to implementing citizen science via the Survey123 for ArcGIS mobile application, which is a part of the ArcGIS ecosystem. This enables the local community to collect information despite the lack of network reception; the data can still be synchronized automatically once connectivity is available, or manually shared with the field officers.
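As an illustration of the 'XLSForm programming' part, here's a toy sketch of how such a community mapping form could be laid out, generated with pandas; the question names, labels and choices are invented for this example and are not the actual SCP survey:

```python
import pandas as pd

# "survey" sheet: one row per question. geopoint, select_one and image are
# standard XLSForm question types that Survey123 understands.
survey = pd.DataFrame([
    {"type": "geopoint",             "name": "location",  "label": "Record the location"},
    {"type": "select_one site_type", "name": "site_type", "label": "What kind of site is this?"},
    {"type": "text",                 "name": "site_name", "label": "Local name of the site"},
    {"type": "image",                "name": "photo",     "label": "Take a photo (optional)"},
])

# "choices" sheet: the options behind the select_one question.
choices = pd.DataFrame([
    {"list_name": "site_type", "name": "farm",     "label": "Farm / cultivated land"},
    {"list_name": "site_type", "name": "cultural", "label": "Culturally important site"},
    {"list_name": "site_type", "name": "water",    "label": "Water source"},
])

# Survey123 Connect can publish a workbook like this as a field survey.
with pd.ExcelWriter("community_mapping_form.xlsx") as writer:
    survey.to_excel(writer, sheet_name="survey", index=False)
    choices.to_excel(writer, sheet_name="choices", index=False)
```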
📌 Availability: Retracted in 2021