Developer Insights
Join millions of viewers! Our engineers craft human-written articles solving real-world problems weekly. Enjoy fresh technical content and interviews with industry leaders and open-source authors on modern web advancements.
Understanding Effects In SolidJS
In SolidJS, effects are a fundamental concept that helps developers manage side effects and reactive dependencies within their applications....
Aug 23, 2024
3 mins
How to set up local cloud environment with LocalStack
Many applications are powered by AWS because of the breadth of purpose-built services it provides. However, testing an AWS application during development without maintaining a dev account is hard....
Jun 19, 2024
6 mins
How to test React custom hooks and components with Vitest
In this guide, we'll navigate through the process of testing React hooks and components using Vitest—a powerful JavaScript unit testing framework. Discover how Vitest simplifies testing setups...
Mar 15, 2024
5 mins
File Based Routing with Expo Router
Learn about Expo Router, a file-based solution for React Native and web apps, enabling seamless navigation across screens using uniform components on Android, iOS, and web....
Jan 10, 2024
8 mins
How to Handle Async Data Fetching Using CreateResource in SolidJS
In this article, you will learn how to handle async data fetching using createResource in SolidJS. The article assumes that you have a basic understanding of SolidJS. If you need a refresher, I recommend looking at the official SolidJS website to get started. You can also visit solidjs.framework.dev/ for a list of SolidJS libraries and resources.

What is createResource?

createResource is a special Signal designed specifically to handle async data fetching. Signals are the cornerstone of reactivity in SolidJS. They contain values that change over time, but unlike typical values, Signals are also event emitters. When the data kept in a Signal changes, the Signal emits events so that anything depending on its value can be notified. Signals are similar to "observables", "streams", or "subjects" in other reactive programming technologies. For more information on Signals, check out the SolidJS documentation.

The purpose of createResource is to wrap async values (any value that is returned as a promise) in a way that makes them easy to interact with in Solid's execution model. This is useful because it ensures that the async function is non-blocking.

How does createResource work?

createResource takes as its argument a function that returns a promise (the fetcher), and it returns a data getter that wraps the value returned from the fetcher function. Easy, right? The resulting Resource Signal (products) also contains reactive loading and error properties that make it easy to control the view based on the current status. Let's see how all of these are used in an example. In this example, we create a fetcher function that makes a call to get a list of products, and pass this function as an argument to createResource. createResource returns a Signal with reactive properties like loading, error, latest, etc., which we use to conditionally render JSX based on the current reactive state.
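The products example described above might look like the following sketch; the API endpoint and the product shape are placeholders, not part of the original article:

```tsx
import { createResource, For, Show } from "solid-js";

// Fetcher: an async function returning a promise of the data.
// The endpoint below is hypothetical; substitute your own API.
const fetchProducts = async () => {
  const res = await fetch("https://api.example.com/products");
  return res.json();
};

const App = () => {
  // products is a Resource Signal with reactive loading/error properties.
  const [products] = createResource(fetchProducts);

  return (
    <Show when={!products.loading} fallback={<p>Loading...</p>}>
      <Show when={!products.error} fallback={<p>Something went wrong</p>}>
        <ul>
          <For each={products()}>{(product) => <li>{product.name}</li>}</For>
        </ul>
      </Show>
    </Show>
  );
};
```

Reading `products()` inside JSX is what subscribes the view to the Resource, so the list re-renders automatically once the promise resolves.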
createResource can also take a source Signal as the first argument and the async fetcher function as the second. The source Signal is provided as the query to the async fetcher function, which returns a promise. A change in the source Signal triggers an internal fetch to load new data based on the change. If you are familiar with React Query, source Signals are similar to query keys.

Let's write an example that fetches a user's profile from GitHub. It takes the username as input and returns the user's profile details. We have an input field that receives the username and sets it as a Signal. createResource takes the source Signal as the first argument and the async fetcher function as the second argument. The internal fetch is triggered anytime the Signal value changes, so if we enter a different username into the input field, createResource is aware of that change, and fetches new data based on the new Signal value.

Refetch and Mutate

The second value that comes back from createResource contains a mutate method for directly updating the internal Signal, and a refetch method to reload the current query even if the source hasn't changed. mutate is useful for "optimistic mutations", such as automatically updating a todo list fetched from the server when someone clicks the add button. refetch is useful when you know the data returned from the server changes constantly and you would like to get the latest information. For example, if you are building an application that needs to show the current price of Bitcoin in US dollars, you will need to refetch the data because the price fluctuates.

Why is createResource special?

For small examples, createResource can be as simple as just setting a Signal.
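The GitHub profile example might be sketched like this; the rendering is kept minimal, and the destructured second value shows where mutate and refetch come from:

```tsx
import { createResource, createSignal, Show } from "solid-js";

// The fetcher receives the current value of the source Signal as its argument.
const fetchUser = async (username: string) => {
  const res = await fetch(`https://api.github.com/users/${username}`);
  return res.json();
};

const UserProfile = () => {
  const [username, setUsername] = createSignal("solidjs");
  // Source Signal first, fetcher second. The second returned value
  // exposes mutate (update the internal Signal directly) and refetch.
  const [user, { mutate, refetch }] = createResource(username, fetchUser);

  return (
    <>
      <input
        value={username()}
        onInput={(e) => setUsername(e.currentTarget.value)}
      />
      <button onClick={() => refetch()}>Reload</button>
      <Show when={!user.loading} fallback={<p>Loading...</p>}>
        <p>{user()?.name}</p>
      </Show>
    </>
  );
};
```

Typing a different username updates the source Signal, which triggers the internal fetch with the new value.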
However, it interacts with the broader Solid framework in ways a plain Signal does not. When your app is server-side rendered, resources are fetched on the server, while effects and Signal setters don't run there, so resources work much better with SolidStart. Any loading resource puts your app in Suspense mode, which provides automatic benefits when interacting with the Solid router.

Conclusion

In this article, we saw how to handle async data fetching using createResource. We also explored some use cases, and when and how to use some of the properties and methods returned by createResource. You can find the live example for this blog post on StackBlitz. If you would like to learn more about SolidJS, check out solidjs.framework.dev for a list of libraries and resources....
Feb 27, 2023
5 mins
Hey Deno! Where is my package.json?
Disclaimer: This blog was written for Deno versions prior to 1.31. Since 1.31, Deno can handle projects that contain a package.json (to help facilitate migration), but it is still recommended that you handle your dependencies as discussed in this article.

Introduction

Where is my package.json? That was one of my first questions when I started learning Deno. Coming from a Node.js background, I was used to having a package manager (such as npm) for managing dependencies, scripts, and other configurations, and to using the package.json file to declare my dependencies and their versions. Deno has no concept of a package manager, as external modules are imported directly into local modules (e.g. import { bold } from 'https://deno.land/std@v0.32.0/fmt/colors.ts'). At first, this seems very convenient, but it got me wondering how I would manage and update dependencies when they are imported into several different modules across a large project. And how about running scripts? Node allows you to define arbitrary scripts in package.json that can be executed using npm run. How do we define and manage scripts in Deno?

In this article, we will discuss the various ways to manage dependencies in Deno, and also how to manage all the scripts needed for our project.

Managing Dependencies

Using deps.ts

The standard practice for managing dependencies in Deno is to create a deps.ts file that imports, and then immediately re-exports, all third-party code. In your local modules, those methods and classes can then be referenced from the deps.ts file rather than from their remote URLs.

You may be familiar with the concept of dev dependencies in npm. You can define dev dependencies in Deno using a separate dev_deps.ts file, allowing for a clean separation between dev-only and production dependencies. With this approach, managing dependencies in Deno becomes much simpler.
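A deps.ts and matching dev_deps.ts might look like the following sketch; the specific std modules and version are illustrative only:

```ts
// deps.ts: import third-party code once, and re-export it for the whole project.
export { bold, red } from "https://deno.land/std@0.170.0/fmt/colors.ts";
export { serve } from "https://deno.land/std@0.170.0/http/server.ts";
```

```ts
// dev_deps.ts: dev-only dependencies, kept separate from production code.
export { assertEquals } from "https://deno.land/std@0.170.0/testing/asserts.ts";
```

A local module then writes `import { bold } from "./deps.ts";`, so a version bump happens in exactly one file.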
For example, to upgrade a dependency's version, you make the change in the deps.ts file, and it propagates automatically to all modules in which it is referenced.

When using this approach, you should consider integrity checking and lock files. This is Deno's solution for avoiding production issues if the content at a remote URL (e.g. https://some.url/a.ts) changes, which could lead to the production module running with different dependency code than your local module. Just like package-lock.json in Node.js, Deno can store and check subresource integrity for modules using a small JSON file. To autogenerate a lock file, create a deno.json file at the root of your project, and a deno.lock file will be autogenerated. You can also choose a different file name by setting the "lock" field in deno.json (e.g. "lock": "./lock.file"), or disable automatically creating and validating a lock file by specifying "lock": false. We can manually create or update the lock file using the deno cache subcommand with the --lock and --lock-write flags (e.g. deno cache --lock=deno.lock --lock-write deps.ts). A new collaborator can then clone the project on their machine and run deno cache --reload --lock=deno.lock deps.ts to verify the dependencies.

Using import_map.json

Another way to manage dependencies in Deno is by using an import_map.json file. This method is useful if you want to use "bare specifiers" (specifiers without an absolute or relative path, e.g. import react from 'react'). The file allows you to alias a specific import URL or file path, which can be useful if you want a custom alias for a dependency. To use the import_map.json file, you first need to create it in the root directory of your project. The file should contain a JSON object with a single key, "imports", which maps import aliases to fully-qualified module URLs or file paths. You can use the import_map.json file to map import paths to remote dependencies and even npm specifiers. You can also use the import_map.json file to map aliases to local file paths.
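An import map covering all three cases — a remote URL, an npm specifier, and a local file — might look like this sketch (the module choices are illustrative):

```json
{
  "imports": {
    "colors": "https://deno.land/std@0.170.0/fmt/colors.ts",
    "lodash": "npm:lodash@4.17.21",
    "my_module": "./src/lib/my_module.ts"
  }
}
```

With this in place, a module can write `import { bold } from "colors";` instead of repeating the full URL.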
For example, if you have a local module in your project at ./src/lib/my_module.ts, you can map the import path "my_module" to this file, and then import the library using its alias. Using the import_map.json file can be a convenient way to manage dependencies in your Deno projects, especially if you want custom aliases for your imports. Just be sure to include the --import-map flag when running your Deno application (e.g. deno run --import-map=import_map.json main.ts). This ensures that Deno uses the import map specified in the import_map.json file when resolving dependencies.

Managing Command Scripts

Like npm run, the Deno CLI also has a run subcommand that is used to run scripts in files. Depending on the permission needed or the type of operation, certain flags need to be passed to the run command. For example, if you want to run a web server that uses an env file and reads from it, your command will look like this: deno run --allow-net --allow-env --allow-read main.ts. If we are reading from and writing to a file, we need to add --allow-write, and if we have an API that needs information about the user's operating system, we will also need to add --allow-sys. This can go on and on. We could decide to use --allow-all to accept all permissions, but this is not advisable.

The good news is that we can manage all our command scripts without having to retype them every time, by adding them to the deno.json file. The deno.json file is a configuration file for customizing basic Deno CLI commands like fmt, lint, test, etc. In addition, the file can be used to define tasks using the "tasks" field, which maps task names to the commands they run.
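A "tasks" section collecting the permission flags from above into named scripts might look like this sketch (the task names are illustrative):

```json
{
  "tasks": {
    "start": "deno run --allow-net --allow-env --allow-read main.ts",
    "dev": "deno run --watch --allow-net --allow-env --allow-read main.ts"
  }
}
```

Running `deno task start` then executes the full command without retyping the flags.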
You can then run these tasks using the deno task subcommand and specifying the task name.

Conclusion

In this article, we saw how to manage dependencies and command scripts in a Deno project. If you're coming from a Node background and were confused about where the package.json file was, we hope this clarified how to accomplish some of the same things using the deps.ts, dev_deps.ts, import_map.json, and deno.json files. We hope this article was helpful, and that you are now more comfortable using Deno for your projects. If you want to learn more about Deno, check out deno.framework.dev for a list of libraries and resources. If you are looking to start a new Deno project, check out our starter kit resources at starter.dev....
Jan 3, 2023
5 mins
Let's innovate together!
We're ready to be your trusted technical partners in your digital innovation journey.
Whether it's modernization or custom software solutions, our team of experts can guide you through best practices for building scalable, performant software that lasts.