TOP 10 AR SDK for iOS & Android Development in 2022

I've put together a top list of iOS and Android augmented reality SDKs (free and paid) intended to shape your AR development.

In 2022, revenue from mobile AR applications amounted to 15,497 million U.S. dollars worldwide (source: statista.com).

For background, see the Wikipedia article on augmented reality (en.wikipedia.org): with the help of advanced AR technology (e.g. adding computer vision and object recognition), the information about the user's surrounding real world becomes interactive and digitally manipulable.

SDK (software development kit or devkit) is typically a set of software development tools that allows the creation of applications for a certain software package, software framework, hardware platform, computer system, video game console, operating system, or similar development platform.

Here are AR SDKs for iOS and Android that you can start using right now!

Free SDKs: ARToolKit, ARKit, Flutter, EasyAR, Xzimg, NyARToolkit, Kudan, MAXST

Paid SDKs: Wikitude, Vuforia, XZIMG, AR-media


Free augmented reality SDKs:

Flutter – Beautiful native apps in record time

Flutter is Google's mobile UI framework for crafting high-quality native experiences on iOS and Android in record time. Flutter works with existing code, is used by developers and organizations around the world, and is free and open source.



ARToolKit

Available on: Android, iOS & Mac, Windows, Linux

ARToolKit is an open-source computer tracking library for creating augmented reality applications that overlay virtual imagery on the real world.

License: GNU Lesser General Public License
Original author: Hirokazu Kato


ARKit by Apple

Introducing Augmented Reality development for iOS, one of the biggest mobile platforms of today. ARKit is an SDK for software developers to create augmented reality apps and games for iPhones and iPads.

Supported platforms: iOS 11.0+
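
To give a feel for the API, here is a minimal sketch of starting an ARKit world-tracking session with SceneKit. This is illustrative only, not Apple's sample code; the class name is my own.

```swift
import UIKit
import ARKit

// Minimal sketch of an ARKit session (illustrative class name).
class MinimalARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking follows the device's position and orientation
        // and can detect flat surfaces like tables and floors.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```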


EasyAR

Available on: Android, iOS & Mac, Windows

Apps powered by EasyAR SDK 3.0


Xzimg

Supported platforms: PC, Android, iOS, Windows, WebGL.


NyARToolkit

Available on: Android, iOS.

NyARToolkit for Processing is provided under LGPLv3.

Latest commit on Sep 21, 2017

The NyARToolkit project is developing a vision-based AR library based on ARToolKit. Current NyARToolkit libraries provide the ARToolKit Professional (ARToolKit5) APIs.


Kudan

Available on: Android, iOS & Unity

According to reviews and efficiency comparisons, Kudan is the main rival of Vuforia and makes augmented reality development very easy.


MAXST

Available on: Android, iOS & Windows, Mac

MAXST AR SDK is billed as the easiest way to develop an AR app. Its AR Manual is a new-concept user's manual at your fingertips.


Paid augmented reality SDKs

Wikitude

Available on: Android, iOS & Unity, smart glasses (www.wikitude.com)

Supported development frameworks: Native API, JavaScript API, Unity3D, Xamarin, Titanium, Cordova.

Wikitude is a mobile augmented reality technology provider based in Salzburg, Austria, founded in 2008. Its AR engine empowers iOS, Android & smart glasses apps with image & object tracking, instant tracking (SLAM), and Geo AR.
Developer: Wikitude GmbH

 
                       Free    Startup        Pro            Pro3D         Demo   Enterprise
Geo                    ✓       ✓              ✓              ✓             ✓      ✓
2D Image Recognition   ✓       ✓              ✓              ✓             ✓      ✓
3D Image Recognition                                         ✓             ✓      ✓
Cloud Recognition                                                                 ✓
Multiple apps                                                                     ✓
Watermark              Yes     No             No             No            Yes    No
Price                  Free    €1990 per app  €2490 per app  €499 per app  —      Call

Vuforia

Available on: Android, iOS & Unity, Windows

Supported platforms: Android, iOS, UWP and Unity Editor.

Vuforia is one of the leading AR platforms. Paid plans start from $99/month.

Read also: Vuforia Developer library — wiki.

XZIMG

Available on: Android, iOS & Windows, WebGL

XZIMG Face Tracking engine is a product developed by XZIMG Research that addresses the Augmented Reality market.

AR-media

Available on: Android, iOS & Unity


Zappar: Augmented, Virtual & Mixed Reality Solutions

Augmented reality adds a new feature to this evolutionary appendage, totally reimagining the role of your phone's camera: used in an app, it becomes a new remote control for the world, connecting you (and your audience, if you're a business owner) to the things and places around you.

It works for iOS or Android, as an embedded component in your existing app, or indeed as a new standalone app.

There are lots of solutions allowing you to start developing your Augmented Reality App right away.

Note: Description for SDK has been copied from their respective official website.

New Translator in iOS 14-15: What’s so great about it?

WWDC 2020 took place a month ago, but the whole world is still discussing the innovations presented by Apple. Needless to say, thousands of iPhone users can't wait to make use of the brand new iOS 14 features. Apart from being visually appealing, the new version of iOS will be full of new functionality, including a built-in translator.

But what’s so revolutionary about it? Should we all really feel excited or is it just a minor update of the existing functionality? Let’s take a closer look at Apple’s Translate and discover what it is capable of. 

What advantages does Translate have?

First of all, it is necessary to mention that Translate will be a standalone application that all iPhones get once they are updated to iOS 14. The Translate app is user friendly and does not require any special actions: you simply open it, select the desired languages (the one you are translating from and the one you are translating to), type your sentences, and get a result in a second or two.

Understanding how the built-in translator works on the iPhone

What's especially great about the Translate app is that it supports dictation, so you don't even need to waste time typing and can use the app on the go. You can easily switch between modes – written and conversational – by rotating your device. Portrait orientation is used for typing your text, while in landscape a microphone button appears and you can start dictating.

There is also an option to listen to the translated text, read aloud with correct pronunciation. Taking this feature and the number of supported languages (there are 11 of them) into account, the iOS translator will be great for learning languages and for communicating with foreigners while on vacation.

Source: Apple 

The first question every Apple fan had on hearing about the Translate app was whether it would require an Internet connection. Yes, it will, but it also has an offline mode, which makes it even more convenient for travellers and freelancers: an Internet connection abroad can fail you, leaving you without any smart helpers. Now at least you will have a great app to help you start a conversation with another person or translate something quickly (directions, information on billboards, etc.).

If you are wondering how the translator manages to work without the Internet, that's thanks to the Neural Engine. In online mode the application uses Apple servers, and the translation is more accurate than in offline mode. However, testers of the iOS 14 beta report that even offline the Translate app provides decent, understandable results.

Speaking of the Neural Engine, it will also improve Siri: you can ask it to translate something for you even when you are not connected to the Internet. Safari will also get a built-in translator and will be able to translate page content on the go without connecting to any third-party services.

Those who learn languages know the struggle of trying to memorize a phrase or word, only to have it slip your mind an hour later. With the Translate app installed on your iPhone you can forget about this issue, because it has a Favourites section and keeps a history of recent translations. Whatever piece of text you were translating, you can bookmark it and get back to it later. It is such a handy feature that we think iPhone users all over the world will be very thankful for it.

We can say for sure that the addition of the Translate app to iOS devices will push numerous app owners to consider updating their solutions, especially if they offer translation features or are related to e-learning. Not only will they have to consider adding an offline mode to keep their apps competitive, they will also need to think about widgets.

Let's not forget that iOS 14 will bring numerous new features to our devices, and such changes always imply improvements. Widgets will be the new go-to feature, so sooner or later every app owner will need to offer widgets in different sizes, each providing a different amount of app-related information on the Home Screen.

The new translator may also boost the popularity of note-taking and reminder apps. Together with the new Scribble feature, it could revolutionize the way we create and use notes: not only can text be written by hand using the Apple Pencil, it can also be easily translated into any of the supported languages by the built-in Translate app.

Should I add some new features to my translation app?

If you have an iOS app that is used for translation, we recommend updating it, starting with a code review. This will help get your app prepared for the iOS update and make sure it works well on it. You may also want to create widgets or an App Clip, since these features will be in demand.

Is offline mode necessary in translation related apps?

Yes, an offline mode is preferable because it makes an app more convenient to use. People don't always have a good Internet connection, especially when traveling and needing your app to translate things on the go.

Is Apple planning to add support for more languages to the Translate app?

Apple has not announced this yet, so as of now there will be 11 languages available in the Translate app. But Apple will probably extend this list with time.

To sum it up

Apple is doing its best to create an effective and seamless user experience, and that is exactly why we like its technology inventions. With the addition of the Translate app, every iPhone user will be able to overcome language barriers and feel more comfortable in a foreign environment. The offline mode, the option to bookmark translated passages and save recent translations, and dictation support will all make learning languages and communicating with people easier.

We cannot say there will definitely be a boom of new applications, but we predict that after the iOS 14 release, reviewing app code will become quite common. Why not improve your solution and make it more advanced if you have a chance to incorporate up-to-date features? After all, 2020 is all about new user experience.

Apple iOS Translate app review

The iOS Translate app is a lifesaver for those who travel frequently. It translates between its supported languages in real time using voice or typed text. And best of all, it's free!

LiDAR or not? ARKit 4 + iOS 14

Earlier this year, Apple launched the 2020 iPad Pro with a transformative technology: LiDAR. They simultaneously released ARKit 3.5, which added LiDAR support to iOS. Just 3 months later, alongside the reveal of iOS 14, they unveiled ARKit 4, which showed a massive leap forward in LiDAR capabilities throughout ARKit and cemented Apple’s commitment to the technology. Now, with rumors of a new LiDAR-enabled iPhone 12 around the corner, you may be wondering: should I care?

This article will answer that exact question by evaluating the use cases of LiDAR and comparing its effectiveness to the deep learning/computer vision models ever-present in our portable devices today.

Background

Hardware Used

  • iPad Pro 12.9 4th gen [LiDAR, A12Z CPU with Neural Engine][1]
  • iPad Mini 5th gen [No LiDAR, A12 CPU with Neural Engine][2]

These two devices make a great comparison pair because they share the same CPU architecture and Neural Engine, so machine learning models should run nearly identically on both. For AR applications that aren't CPU-bound, the only difference should be the LiDAR sensor.

LiDAR Data Available

In ARKit 3.5, Apple augmented existing AR features like Raycasting, Motion Capture, and People Occlusion with LiDAR data to improve their effectiveness. They also provided a couple of brand new use cases of LiDAR such as illuminating real-world surfaces with virtual lighting and allowing virtual objects to collide with real-world surfaces [3].

In ARKit 4, in addition to further improvements to those features, Apple introduced the Scene Geometry API, which returns a mesh of the surrounding environment, and the Depth API, which provides the depth, in meters from the camera, of each pixel in each frame at 60 fps, along with a confidence value for each measurement [4].

Now, let’s see if this data is any good!

Comparison

Instant AR Placement

Virtual object placement works in iOS by first identifying the properties of a surface, e.g. its plane vectors, and then placing the virtual object on it. Surfaces can be horizontal, vertical, a face, an image (e.g. a book or poster), or an object. You can explore these surface options, along with ARKit in general, in Apple's Reality Composer [5].
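
As a sketch of what that looks like in code, here is one way to place an object on a detected surface using RealityKit's raycasting. The function name, tap point, and box entity are illustrative assumptions, not part of any official sample:

```swift
import ARKit
import RealityKit

// Sketch: place a virtual box where the user tapped, assuming an
// ARView named `arView` is already running a world-tracking session.
func placeObject(at screenPoint: CGPoint, in arView: ARView) {
    // Ask ARKit for real-world surfaces along the ray through the tap.
    // .estimatedPlane also allows placement before a plane is fully mapped.
    let results = arView.raycast(from: screenPoint,
                                 allowing: .estimatedPlane,
                                 alignment: .horizontal)
    guard let first = results.first else { return }

    // Anchor a simple 10 cm box at the hit location.
    let anchor = AnchorEntity(world: first.worldTransform)
    anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))
    arView.scene.addAnchor(anchor)
}
```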

On iOS devices without a LiDAR sensor, before placing an object on a surface, the user is prompted to move around a bit to help the machine learning models get a better understanding of the surfaces. This makes for a slow and annoying experience when placing an object. The movement is required because the cameras need an accurate perception of depth before a virtual object can be placed accurately in the real world, and varied camera angles of the room let the machine learning models do a better job [6]. This process is called onboarding, and it's something LiDAR improves upon significantly.
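
That onboarding prompt is something ARKit provides out of the box via the coaching overlay. Here is a minimal sketch of wiring it up (the function name is mine):

```swift
import ARKit

// Sketch: ARKit's built-in onboarding UI. The overlay prompts the user
// to move the device until tracking is ready, then hides itself; on
// LiDAR devices it typically disappears almost immediately.
func addCoachingOverlay(to sceneView: ARSCNView) {
    let overlay = ARCoachingOverlayView()
    overlay.session = sceneView.session
    overlay.goal = .horizontalPlane      // what we are waiting for
    overlay.activatesAutomatically = true
    overlay.frame = sceneView.bounds
    sceneView.addSubview(overlay)
}
```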

With LiDAR, these depth estimates happen almost instantly, allowing users to place objects as soon as a plane is visible. You can test this with ARKit 4's object placement, which automatically uses LiDAR if the device supports it. The effect is especially pronounced in low light. Below are two videos I shot running Apple's Measure app on each iPad in a low-light environment.

LiDAR-enabled iPad: https://youtu.be/6qDCBBi4NV8

Non-LiDAR iPad: https://youtu.be/rGhIlLSaKOk

As shown in the videos, the LiDAR experience is a lot simpler for the user and allows for object placement in adverse conditions such as low light.

People Occlusion

Apple unveiled People Occlusion [7] in ARKit 3/iOS 13, which allows AR objects to be placed and rendered correctly as people pass in front of them in the camera frame. You can see its effect in the comparison below:

Without People Occlusion, objects were placed directly on top of people in the frame. Image source.
With People Occlusion, the person correctly covers up the object. Image Source.

Now how does this work? Apple explains the magic in the ARKit 3 introduction video [8]: 1) a machine learning model figures out which pixels are people; 2) a second machine learning model estimates the depth of each person, determining whether they should cover the AR object or be covered by it; 3) using this data, the layers are drawn in the correct order.
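
Opting into this pipeline is a single frame semantic; here is a hedged sketch (the function name is mine). On LiDAR devices, ARKit substitutes measured depth for the step 2 estimate without any code changes:

```swift
import ARKit

// Sketch: enable People Occlusion with depth on a running session.
func enablePeopleOcclusion(on session: ARSession) {
    guard ARWorldTrackingConfiguration
            .supportsFrameSemantics(.personSegmentationWithDepth) else {
        print("People Occlusion is not supported on this device")
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    session.run(configuration)
}
```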

Analyzing step 2, however, it quickly becomes apparent that a) this is quite a difficult task for a machine learning model, and b) this all-or-nothing approach to determining whether a person is in front of the object breaks down when half of the person is behind the object and the other half isn't. Running ARKit 3 on this situation yielded:

Without LiDAR, the ML models are having trouble occluding the object correctly

Instead of estimating distance with machine learning, the model in step 2 of the People Occlusion pipeline can be replaced with a LiDAR sensor, which actually measures how far each pixel is from the camera. This data, which comes in the form of a depth map (explored below), can be used to determine exactly which parts of the camera image should be in front of or behind the AR object. This is exactly what Apple did in ARKit 4, enabling much more accurate People Occlusion than a camera alone. Or so Apple tells us. In my testing, Apple's current implementation fails in many circumstances; about half the time I got results like the one below, even with the $1000 iPad Pro and its LiDAR sensor:

Overall, this means that while LiDAR does improve on camera-only depth perception, either the sensor isn't perfect or Apple's software still needs work to get this exactly right.

Occlusion behind real-world objects

As explained in the People Occlusion section above, non-LiDAR devices achieve the occlusion effect via a machine learning model that identifies people and then estimates their distance from the camera. LiDAR devices, on the other hand, don't need to know that they are people at all: they just know how far each pixel is from the camera (as explained in the Background section), and can determine on a per-pixel level whether it should be occluded by, or occlude, the virtual object.
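
In code, this per-pixel occlusion against arbitrary geometry comes from combining scene reconstruction with RealityKit's scene-understanding occlusion option. A sketch, assuming an existing `arView` (the function name is mine):

```swift
import ARKit
import RealityKit

// Sketch: occlude virtual content behind any real-world geometry.
func enableSceneOcclusion(on arView: ARView) {
    // Scene reconstruction requires a LiDAR sensor.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh
    arView.session.run(configuration)

    // Hide virtual content behind the reconstructed real-world mesh.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```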

This much finer-grained level of detail allows LiDAR-equipped devices to have virtual objects occluded by any real-world surface or object, and to easily handle cases where only certain parts of the virtual object are visible. This works well in many cases. However, I was able to reproduce very obvious flaws in the implementation quite easily. For example, here's a virtual vase that is clearly behind an object:

And here’s the iPad not occluding the object at all, even though it should.

Clearly there is still work to do.

Cost

LiDAR sensors are expensive, so the tech is currently only available in the 2020 iPad Pros, which start at $800 USD and only go up in price if you want a larger screen or more than 128 GB of storage. This means LiDAR is less likely to show up in lower-tier phones, and in turn, many if not most people won't be carrying around a LiDAR device for the foreseeable future.

This means that if you are developing something using some of the LiDAR specific features mentioned in the new features section below, expect only a fraction of the iOS device ecosystem to be able to use it.

On the flip side, now that LiDAR devices are coming, if you build with the rest of ARKit (all of the features mentioned above), not only will those features run on the majority of iOS devices, they'll also get better over time as your users migrate to LiDAR-enabled devices, all without any extra code on your part. This makes AR development all the more attractive right now!

Other Considerations

Deep learning models for computer vision are getting better every day [9]. Thousands, if not millions, of developers around the world are working on improving them. As this technology improves and Apple adopts new methodologies, the LiDAR-less AR experience will improve on all supported devices. So while LiDAR has a lead in depth perception right now, the lead is shrinking every day, and one day camera-based approaches might be just as accurate (if not more so). Elon Musk sure thinks so.

If so, time spent building LiDAR-specific experiences may only be an investment for the short-to-medium term, while computer vision plays catch-up. Granted, this rapid improvement of computer vision is speculation, and even if it happens, it will take several years at the very least.

Another limitation is the 5 m range of the LiDAR sensor in the iPad Pros. This means that a) your experiences can't place objects (or use the LiDAR sensor in any way) more than 5 meters from the user, and b) even within that range, LiDAR points become quite sparse as you get further from the device, which prevents very precise data at longer distances. For example, in the Depth Map section below, I show a point cloud that illustrates this density problem when the points of interest are 2 meters from the user.

New Features Enabled by LiDAR

Scene Geometry

Scene Geometry lets a user create a topological map of their space, with labels identifying floors, walls, etc. Companies like canvas.io [10] are already using this feature to map rooms and buildings using just the LiDAR scanner on the 2020 iPad Pro.

A 3-D room scan from canvas.io
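
A minimal sketch of turning on Scene Geometry with per-face classification and watching the mesh arrive; the class is illustrative, and real code would render or analyze the mesh rather than print:

```swift
import ARKit

// Sketch: receive labeled scene-geometry mesh chunks from ARKit.
class MeshWatcher: NSObject, ARSessionDelegate {
    func start(session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        // Classification labels mesh faces as .floor, .wall, .table, etc.
        if ARWorldTrackingConfiguration
            .supportsSceneReconstruction(.meshWithClassification) {
            configuration.sceneReconstruction = .meshWithClassification
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // Each ARMeshAnchor carries one chunk of the environment mesh.
        for mesh in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            print("New mesh chunk with \(mesh.geometry.faces.count) faces")
        }
    }
}
```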

Canvas.io claims that by scanning your house with the LiDAR scanner, the resulting 3D model will have an error of only 1–2%! This is far more accurate than measurements taken with non-LiDAR devices. I tried using Apple's Measure app to "measure" the length of a tape measure, and LiDAR proved several times more accurate in darker conditions. In well-lit conditions, however, camera-only measuring also worked almost flawlessly (down to the centimeter, at least).

iPad mini 5 getting the measurement exactly right in good lighting
iPad mini 5 struggling to measure accurately in low-light
iPad Pro 2020 measuring relatively accurately in low-light

Note: even though it looks like the picture is darker in the iPad Mini shot, the two pictures were taken in the exact same light. This may have to do with the cameras on the iPad Pro being better at brightening up low-light images.

Depth Map

ARKit 4 gives applications access to a depth map on every AR frame. The depth map provides, for each pixel in the frame, that pixel's distance from the camera. You can feed this depth map, combined with the camera's color and orientation data, into a Metal vertex shader that unprojects and colours the individual LiDAR points in 3D space. Then, by accumulating the points from each frame into a persistent 3D state, we end up with a point cloud, which can be rendered with Metal. Here's the result of running this code on a 2020 iPad Pro:

Point cloud of my desk setup from running this code on the iPad Pro 2020
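
Accessing that per-frame depth map is straightforward; here is a sketch of reading it (the class name is mine, and the unprojection and Metal rendering are omitted):

```swift
import ARKit

// Sketch: read the Depth API's per-frame depth map.
class DepthReader: NSObject, ARSessionDelegate {
    func start(session: ARSession) {
        guard ARWorldTrackingConfiguration
                .supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap holds Float32 distances in meters from the camera;
        // sceneDepth also carries a matching per-pixel confidence map.
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap = sceneDepth.depthMap
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap)) pixels")
    }
}
```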

The varying density of points is caused by how close the LiDAR sensor got to each cluster. Looking at the top left of the image, you can see the points aren't very dense, since I didn't get very close to them (about 2 meters away). Near the center of the image, on the other hand, you can almost make out individual keyboard keys, along with the backlighting of my laptop's keyboard. To achieve this precision I passed the LiDAR sensor over my laptop at a distance of about 10 cm (any similarly close distance should achieve similar results).

Other Features

In addition to Scene Geometry and the Depth API, LiDAR enables a couple of other features, such as allowing virtual light sources to illuminate physical surfaces, which makes for a far more realistic experience in AR apps. Further, the precise scene geometry allows virtual objects to collide with real-world objects and surfaces, enabling many complex experiences that mix the virtual and physical worlds [3].
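
With RealityKit, enabling that collision behavior is a one-line option once scene reconstruction is running; a sketch, where `arView` and the function name are assumptions:

```swift
import RealityKit

// Sketch: let virtual entities with physics bodies collide with and
// rest on real-world surfaces reconstructed by the LiDAR sensor.
// Assumes the session is already running with sceneReconstruction on.
func enableRealWorldPhysics(on arView: ARView) {
    arView.environment.sceneUnderstanding.options.insert(.physics)
}
```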

Summary

As shown in the comparison and the discussion of new features, LiDAR brings many useful capabilities that enable experiences that weren't possible before, along with improving existing ones. Instant object placement and vastly superior occlusion are just two examples of LiDAR's clear advantages.

On the flip side, we also discussed the added cost of LiDAR sensors, which limits the possibility of mass adoption at every price point, and the ever-improving computer vision capabilities of our phones' cameras, which may eventually catch up to LiDAR's depth estimation.

With that in mind, let's break this down into two kinds of experiences:

  1. For experiences that benefit from LiDAR but don't require much additional development effort (such as object placement and object occlusion), it's definitely worth adding the configuration to your app to make the experience significantly better for your LiDAR-enabled users.
  2. For experiences that only work with LiDAR, such as building a 3D model of your house using the Depth API, consider that LiDAR adoption might not be widespread: it may remain limited to the expensive "Pro" models of iPhones and iPads, which usually aren't the most popular. Further, many of these LiDAR features aren't fully baked yet, so you can't depend on them working perfectly. If these constraints are okay, LiDAR might let you build something that wasn't possible before!

Overall, LiDAR is an extremely powerful technology and I hope you use it to build something awesome!

References

[1] https://www.apple.com/ipad-pro/

[2] https://www.apple.com/ipad-mini/

[3] https://developer.apple.com/news/?id=03242020a

[4] https://developer.apple.com/news/?id=1u5zg8ak

[5] https://developer.apple.com/augmented-reality/reality-composer/

[6] https://developer.apple.com/design/human-interface-guidelines/ios/system-capabilities/augmented-reality/

[7] https://developer.apple.com/documentation/arkit/occluding_virtual_content_with_people

[8] https://developer.apple.com/videos/play/wwdc2019/604/

[9] https://medium.com/@ODSC/the-most-influential-deep-learning-research-of-2019-21936596fb4d

[10] https://canvas.io/

Best streaming apps for iPhone

It has been a year since the first news of a global pandemic started hitting us on a day-to-day basis. Our entire lifestyle has undergone a major shift, and people across the globe have been forced to either stay indoors or stay away from others for prolonged durations. With lockdowns and work-from-home norms underway in many regions, technology, and especially the Internet, has proven to be a solace for billions of people. Video-on-demand (VOD) and over-the-top (OTT) content providers have seen a massive surge in viewership. With high-speed Internet penetration reaching more and more small cities and towns, iPhone users got a chance to take advantage of their beautiful screens and watch immersive content from every corner of their house.

Online video streaming has gained so much popularity during the pandemic that some streaming apps had to cap stream quality in Europe, India, the US, and most other high-usage regions across the world. This is where iPhone users have an advantage, since the device's software and hardware work extra hard to provide the best user experience at all times. Some of the best online video streaming apps for your iPhone are listed below, so you never run out of content to watch in times of boredom.

  1. Popcornflix – a 100% legal, subscription-free app for when you do not want to pay to watch content. iPhone users can watch feature-length movies and TV shows starring many of the biggest names at the box office. It lets you search for your favorite content by genre and stream as much as you want without paying a dime!
  2. Pluto TV – again, a content provider that charges no fee to watch anything in its library. You can watch thousands of movies and more than 250 channels on your iPhone from anywhere in the world, whenever you want, without worrying about shelling out cash. Whether it is news, sports, TV shows, or much more, Pluto TV offers something for every user.
  3. IMDb TV – IMDb is the world's most trusted source of information about all things cinema. With IMDb TV you get access to selected movies, favourite TV shows, behind-the-scenes exclusives, trailers, and IMDb original content. If you wish to stay up to date with entertainment news, awards, events, etc., this app should be at the top of your list.
  4. Plex – this one too gives you access to thousands of shows and movies, including documentaries, docuseries, Bollywood musicals, and much more! What is exciting about Plex is that, with its media server system on your home computer, you can create your own video, photo, and music library and stream it to any of your devices, including your iPhone, anywhere in the world! You can also record broadcast TV using Plex.
  5. Apart from the apps mentioned above, the most famous online video streaming apps like Netflix, Amazon Prime Video, YouTube, Hulu, HBO Max, Disney+, etc. also offer a host of video content that can last a lifetime.

With streaming services gaining popularity, live streaming platforms are also on the rise. Living through the pandemic and navigating various restrictions, live streaming your own content has gained a lot of traction in the last year. Whether it is an online party, online classes, or even a wedding ceremony, live streaming has taken the world by storm. With iPhones becoming ubiquitous, there are a host of apps that provide a live streaming platform so you can connect with your friends, family, or fans anywhere in the world. Some of the better apps are mentioned below for you to choose from.

  1. Facebook Live – the largest and most used live content streaming app, Facebook allows you to live stream to all your friends and family wherever you are. It's fast, it's easy, and it is interactive enough to keep you and your viewers engaged.
  2. Instagram – this one too allows you to reach your viewers from your home, from the beach, from your own wedding, from under the stars, or just about anywhere you have an Internet connection (most recent iPhones are even water resistant!).
  3. Livestream – as the name suggests, this one enables professionals and amateurs alike to broadcast their events to the world. It is a market leader in this space, letting you browse all the live streams happening now or planned for the future.
  4. Periscope – an app that provides one of the most user-friendly live streaming platforms in your palm. Apart from the usual features, this app lets you scan your vicinity to find live streamers nearby.

Twitch, Streamago, Alively, and Broadcast Me are other apps you can explore if you want a live streaming platform for reaching the world through your content.

Directory Listing Website and Mobile App Development

I have a project idea that I want to develop as a website and an app (iOS and Android).

Useful Directory Listing Websites and Mobile Apps

The idea is to design and develop a business directory website and mobile application (Android and iOS) for directory listings.

Required Skills: design, email, Swedish translation, mobile development, Microsoft access, games, General / Other Apps & Mobile, Mobile Device Platforms, Responsive & Hybrid

Fixed Price Budget: $5k – $10k

What is the difference between directory listing websites and classified websites?