Apple / iOS Archives - Phunware

Phunware Launches “Healthy Spaces” Mobile App Update for Android on Google Play (December 22, 2020)

Phunware announces a new release of its mobile application software, Healthy Spaces, on Google Play for Android so individuals and businesses can easily schedule gatherings and enable contactless check-in to screen attendees before or upon arrival at venues and facilities.

“People are excited to be social again and businesses need tools like Healthy Spaces that will allow them to operate in a safer and more responsible manner,” said Randall Crowder, COO of Phunware. “With this latest release, any individual or business can use Healthy Spaces to seamlessly screen people before an event, a day at work or even a well-deserved night out.”

Read the full article from Proactive

Navigating Permission Changes in iOS 14 (September 8, 2020)

When it launches this fall, iOS 14 will bring several new permission changes for requesting access to the user’s location, advertising ID, photos, and local network. This blog breaks down the upcoming changes and provides recommendations on how best to handle these new privacy-related permission prompts.

Location Permission Changes

Apple is continuing with its commitment to give app users more control over their data and privacy. Last year, with the release of iOS 13, Apple gave users the option to decide if the app should have access to their location only once, only while using the app, or always. 

This year, with the release of iOS 14, Apple will build upon that and also allow users to decide whether the app should have access to their precise location or only their approximate location.

New: Precise Location Toggle

When an app requests the user’s location, the user will be presented with a permission prompt asking for location access with the same options as iOS 13: Allow Once, Allow While Using App, or Don’t Allow. 

As with previous versions of iOS, the title of the permission prompt is controlled by Apple, while the app developer configures the subtext, which is intended to explain to the user why the app is requesting this permission.

What’s new in iOS 14 is the user’s ability to toggle precise location on and off. The Precise setting is enabled by default, which means the app will get the user’s fine location. If the user disables it, the app will only get the user’s approximate location. In our tests, the approximate location could place the user up to 2 miles away from their actual position.
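
For illustration, here’s a minimal Swift sketch of checking which accuracy level the user granted, using the accuracyAuthorization property Apple added to CLLocationManager in iOS 14:

import CoreLocation

let manager = CLLocationManager()
manager.requestWhenInUseAuthorization()

// Check whether the user left the Precise Location toggle enabled.
switch manager.accuracyAuthorization {
case .fullAccuracy:
    break // Precise location granted; expect fine coordinates.
case .reducedAccuracy:
    break // Approximate only; coordinates may be miles off.
@unknown default:
    break
}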

New: Temporary Request for Precise Location

Another change is the app’s ability to temporarily request precise location if the user previously only allowed approximate accuracy. This is a one-time permission request that, if granted, only lasts during the duration of the app session. According to Apple, “This approach to expiration allows apps to provide experiences that require full accuracy, such as fitness and navigation apps, even if the user doesn’t grant persistent access for full accuracy.”
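
Here’s a minimal sketch of that one-time request using requestTemporaryFullAccuracyAuthorization; the purpose key “Wayfinding” is a hypothetical entry you would define under NSLocationTemporaryUsageDescriptionDictionary in Info.plist:

import CoreLocation

let manager = CLLocationManager()

// Ask to upgrade from approximate to precise location for this session only.
// "Wayfinding" is a hypothetical purpose key declared in Info.plist.
manager.requestTemporaryFullAccuracyAuthorization(withPurposeKey: "Wayfinding") { error in
    if manager.accuracyAuthorization == .fullAccuracy {
        // Precise location is available until this app session ends.
    }
}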

Background Location Permission

App developers may need the user’s location in the background to support features such as geofence notifications. As in iOS 13, Apple doesn’t offer this option on the first request, but instead allows the app developer to request this permission at a later time. If your app requested Always Allow permission, this prompt will be displayed automatically the next time the user launches the app, though typically not on the same day the initial prompt was displayed.
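
If the user has already granted While Using, the later upgrade request is a single call; a minimal sketch, assuming the CLLocationManager instance from the earlier sketches:

// Ask to upgrade to Always, e.g. when the user enables geofence notifications.
manager.requestAlwaysAuthorization()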

Once an app has received the user’s location in the background a significant number of times, Apple will inform the user and ask them if they want to continue allowing this. This is also unchanged from iOS 13. 

New: Updated Location Settings

Users can adjust their location settings in the iOS Settings app by navigating to Privacy → Location Services → App Name.

Users will have the option to adjust their location access to Never, Ask Next Time, While Using the App, or Always.

If a user receives a location permission prompt and selects Allow Once, their location setting will be Ask Next Time, prompting them to make a selection again the next time the app requests their location.

What’s new in iOS 14 is the Precise Location toggle, which allows users to switch between precise and approximate location.

Impact

The most significant impact of these changes will be on apps that require a precise location, such as navigation apps or apps that use geofence notifications. Given that an approximate location could put the user miles away, the precise location option is required for these apps. 

As mentioned earlier, the app has the option to temporarily request precise location from a user who has previously only granted approximate location. This request can be triggered when the user begins a task that requires fine location, such as wayfinding. 

However, there isn’t an explicit user action to trigger this temporary permission request when it comes to geofence notifications, so the temporary precise location prompt won’t help us here.  

In addition, geofence notifications require the Always Allow background location selection, so apps that promote this feature will feel the impact most.

Recommendations

  • Don’t request the user’s location until you need it.
  • Include a usage description clearly explaining why you need the user’s location.
  • Don’t request Always Allow permission unless you have a feature that requires the user’s location when the app is closed or backgrounded.
  • If you require precise location, but the user has only granted approximate location, use a one-time temporary precise location request.
  • If you require Always Allow + Precise location settings for geofences, but the user hasn’t granted them, include a custom alert or screen explaining the benefit of enabling these settings, along with instructions for changing them in iOS Settings and a button that deep links there (see the sketch after this list).
  • Remember, if the user chooses Don’t Allow, you won’t be able to request this permission again.
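
The deep link mentioned above is straightforward; a minimal sketch using UIKit’s openSettingsURLString constant:

import UIKit

// Open this app's own page in the Settings app.
if let settingsURL = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(settingsURL)
}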

IDFA Permission Changes

The IDFA, or Identifier for Advertisers, is going to change as we know it. Ad agencies have relied on this device identifier for years to track users across apps and websites to learn their habits and interests so that they can target them with relevant ads. 

This was made more difficult with the release of iOS 10, when users could enable a Limit Ad Tracking setting, which would return all zeroes for this identifier. Before that, the only thing a user could do was reset their identifier value, but this option was seldom used.

New: IDFA Prompt

iOS 14 brings the strongest changes to the IDFA yet, which may effectively kill it as the primary way advertisers track users. Rather than defaulting to having the IDFA available, developers will now have to prompt the user to allow access to the IDFA. 

The wording in the permission prompt will undoubtedly lead to a majority of users declining this permission: “App would like permission to track you across apps and websites owned by other companies.“

Like the Location Permission prompt, the IDFA prompt’s title is controlled by Apple, but the app developer configures the subtext. Developers will have to come up with a usage description convincing enough to persuade users to allow themselves to be tracked.

According to Apple, “The App Tracking Transparency framework is only available in the iOS 14 SDK. This means that if you haven’t built your app against iOS 14, the IDFA will not be available and the API will return all zeros.”

However, on September 3, 2020, Apple extended the deadline to 2021, stating, “To give developers time to make necessary changes, apps will be required to obtain permission to track users starting early next year.“
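
For reference, here’s a minimal sketch of requesting the permission through the App Tracking Transparency framework in the iOS 14 SDK; it assumes an NSUserTrackingUsageDescription string in Info.plist:

import AppTrackingTransparency
import AdSupport

// Prompt the user for permission to access the IDFA.
ATTrackingManager.requestTrackingAuthorization { status in
    if status == .authorized {
        // Permission granted; the real identifier is available.
        let idfa = ASIdentifierManager.shared().advertisingIdentifier
        print(idfa.uuidString)
    } else {
        // Denied, restricted, or undetermined; the IDFA reads as all zeros.
    }
}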

New: Updated IDFA Settings

Also new in iOS 14 is a toggle in iOS Settings that, when disabled, prevents app developers from ever prompting the user for permission to use their IDFA. A user can find this setting in the iOS Settings app under Privacy → Tracking, and it applies globally to all apps.

Impact

The most significant impact will be on the ad industry. Without a guaranteed way of tracking users across apps and websites, advertisers will need to rely on less precise ways of tracking users. Since getting the user’s IDFA was never guaranteed, advertisers already have fallback methods for tracking users. Such methods include fingerprinting, where a collection of other information about the user, such as IP address, device model, and rough location, is used to infer that they are the same user. Another option is to use sampling, since there will still be some users who allow themselves to be tracked. For example, if 5% of tracked users installed the app through a particular install ad, one can presume that about 5% of all users can be attributed to that campaign.

Recommendations

  • Don’t request the user’s IDFA if your use case can be satisfied with the IDFV (Identifier for Vendor) instead. The IDFV is similar to the IDFA in that it’s a unique identifier for the user. However, each app developer is assigned a different IDFV per user, so it can’t be used to track users across apps and websites from other developers. Since there are no cross-developer privacy concerns, no permission prompt is needed to obtain the IDFV, and the user has no way to disable it (see the sketch after this list).
  • Include a usage description clearly explaining why you’d like to track the user across apps and websites.
  • Consider a custom prompt in advance of the official IDFA permission prompt to provide your users with more context before the scary system prompt is presented.
  • If a user declines the IDFA permission and you need to track them outside your app, use probabilistic methods such as fingerprinting or sampling.
  • Remember that if the user chooses Ask App Not to Track or if they disable the ability to prompt for this permission in Settings, then you won’t be able to request this permission. The best you can do at that point is to detect that they declined this permission, show some custom prompt, and direct them to the Settings app to enable the permission there.
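
Reading the IDFV, by contrast, needs no prompt at all; a minimal sketch:

import UIKit

// Stable across apps from the same vendor, but different for each vendor.
if let idfv = UIDevice.current.identifierForVendor {
    print(idfv.uuidString)
}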

Photo Permission Changes

Apple has required apps to ask the user’s permission to access the camera or photos since iOS 8. However, photo access was all-or-nothing, giving the developer access to the entire photo library. New in iOS 14 is the ability for users to choose whether they want to grant access to all photos or only specific photos.

New: Select Specific Photos

The initial photo permission prompt will ask the user if they would like to grant access to one or more specific photos, grant access to all photos, or decline this permission. A user who is simply trying to upload one specific photo may choose only to grant access to that photo. 

If a user only grants access to specific photos, the next time the app requests the photo permission, the user will receive a slightly different permission prompt. The new prompt will ask them if they would like to allow access to more photos or keep the current selection of photos they’ve previously allowed. 
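
A minimal sketch of requesting photo access with the new limited option, using the PHPhotoLibrary API added in iOS 14:

import Photos

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .limited:
        break // Access to the user-selected photos only.
    case .authorized:
        break // Access to the entire library.
    default:
        break // Denied, restricted, or not yet determined.
    }
}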

New: Updated Photo Settings

Users can adjust their photo settings in the iOS Settings app by navigating to Privacy → Photos → App Name. Users can choose from the following options: Selected Photos, All Photos, or None.

If Selected Photos is chosen, then an option to Edit Selected Photos appears. Tapping this presents a Photo Picker, which includes search functionality, the ability to view albums, and the ability to view previously selected photos. 

Note: The permission prompts and settings options only refer to photos. However, the same applies to videos.

Impact

This new privacy change should have minimal impact on apps that require the user to grant permission in order to upload specific photos or videos. The biggest impact will be on apps requiring permission to the entire camera roll, such as Google Photos. 

Recommendations

  • Don’t request photo access until the user is performing an action that requires this, such as uploading a photo or video.
  • Include a usage description clearly explaining why your app requires this permission.
  • Remember that these permission changes apply to videos as well.
  • Remember that if the user chooses Don’t Allow, you won’t be able to request this permission again. The best you can do at that point is to detect that they declined this permission, show some custom prompt, and direct them to the Settings app to enable the permission there.

Local Network Permission Changes

There are many legitimate reasons an app might need to use the local network. For example, it may connect to a printer, search for nearby players for a game, or control the lights in a home. 

At the same time, there are also less legitimate reasons that apps use the local network. They could be collecting information about the devices on the local network to create a “fingerprint,” which allows them to infer that a user is at home, even without the user granting location permission.

In iOS 13, Apple required apps to request permission for access to Bluetooth. Now in iOS 14, they are doing the same for the local network. If your app communicates with devices over your home WiFi, for example, then it is operating over the local network and will trigger this new permission prompt.

There are exceptions for system-provided services such as AirPrint, AirPlay, AirDrop, or HomeKit. These system services handle device discovery without exposing the full list of devices to the app, so they are exempt from triggering this permission prompt.

Any other network connections outside the local network (e.g., Web Services, APIs, or other connections to the internet) are not impacted and do not require permission.

New: Local Network Prompt

When an app tries to connect to the local network, it will trigger a Local Network Permission Prompt even if only to view available devices.
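
For illustration, a minimal sketch of a Bonjour browse that would trigger the prompt; “_example._tcp” is a hypothetical service type that you would declare under NSBonjourServices in Info.plist, alongside an NSLocalNetworkUsageDescription string:

import Network

// Browsing for services on the local network triggers the permission prompt.
let browser = NWBrowser(for: .bonjour(type: "_example._tcp", domain: nil), using: .tcp)
browser.browseResultsChangedHandler = { results, _ in
    print("Found \(results.count) local services")
}
browser.start(queue: .main)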

Impact

Many applications use the local network for use cases other than the system services previously mentioned. We’ve found that most streaming apps trigger this permission prompt upon launch, likely because they support Google Cast. There may be apps that have Analytics SDKs that collect this type of information. Those apps will also display this prompt upon app launch. 

Recommendations

  • Add logic to defer this permission prompt until the user performs an action that requires it, such as searching for nearby players or casting a video.
  • Include a usage description clearly explaining why your app needs to use the local network.
  • Remember that if you change nothing before the iOS 14 release date and your app uses the local network, this permission prompt will be one of the first things users see when they launch your app on iOS 14.
  • Remember that if the user chooses Don’t Allow, you won’t be able to request this permission again. The best you can do at that point is to detect that they declined this permission, show some custom prompt, and direct them to the Settings app to enable the permission there.

Other Privacy Changes

New: Microphone/Camera Indicator

iOS 14 will display a colored dot in the status bar, indicating the current app is actively using the microphone or camera. Be careful not to use any low-level camera/microphone APIs unless the user is performing an action to capture audio or video. 

New: Pasteboard Alerts

iOS 14 will display a banner at the top of the screen, indicating the current app has just extracted the contents of the user’s Pasteboard (also known as clipboard). Some apps use the pasteboard to detect copied URLs to surface the right information when the user moves to their native app. 

Be careful with any Analytics SDKs you include in your app that may be collecting this user data.
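
One way to be a good citizen here is to inspect the pasteboard’s metadata before reading its contents; a minimal sketch using UIPasteboard’s hasURLs property, which checks for a copied URL without extracting the value (it’s the read itself that surfaces the banner):

import UIKit

// Check for a copied URL without reading the pasteboard contents.
if UIPasteboard.general.hasURLs {
    // Only extract the value when you actually need it.
    let copiedURL = UIPasteboard.general.url
}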

More Info

For a quick and easy reference to the iOS 14 permission changes discussed in this blog, download our Location Cheat Sheet:
Download The Location Cheat Sheet

At Phunware, our Engineering team is dedicated to staying up-to-date with the latest changes from Apple WWDC and Google I/O. If you’re a Product Manager looking for a Location or Analytics SDK built by a team that understands these privacy-related changes, then visit our Products page for a complete list of our software products, solutions, and services.

Phunware Launches New Telemedicine Solution (April 30, 2020)

Today Phunware announced the launch of a new mobile telemedicine solution for new and existing healthcare customers of its Multiscreen-as-a-Service (MaaS) platform.

“Healthcare organizations are being forced to leverage telemedicine in order to stay competitive with their digital transformation initiatives and to address patient concerns about the safety underlying in-person visits in the wake of COVID-19,” said Randall Crowder, COO of Phunware. “Our new solution offers physicians an out-of-the-box telemedicine platform on mobile with streamlined reimbursement that keeps its referrals in-network to help reduce their patient leakage while enhancing their revenues.”

Read the full article from Proactive

Phunware Offers A Free 60-Day License Of Its Mobile Engagement SDK (April 23, 2020)

Phunware recently announced an offer for a free 60-day license of its Mobile Engagement software development kits (SDKs) to qualifying small and midsize businesses (SMBs). In order to receive the SDK at no cost, the qualifying business must complete the Phunware Phenom Certified Developer Program within the next 60 days.

“Our hearts go out to everyone directly affected by COVID-19, but we are just as concerned about the untold toll this pandemic is having on small and midsize businesses nationwide as they scramble to adapt to emerging state and federal guidance,” said Randall Crowder, COO of Phunware. “Our enterprise cloud platform for mobile is uniquely suited to help them not only adhere to these guidelines, but also to engage and manage customers in a mobile-first world that is rapidly becoming mobile-only.”

Read the full article from Proactive

Sign up for the Phenom Certified Developer Program

Phunware’s Smart City Solution Launches (April 9, 2020)

Today Phunware announced the launch of a Smart City Pandemic Response Solution to help government officials address the critical challenges they are facing in their cities due to the coronavirus (COVID-19) pandemic.

“We think it is extremely important for our country’s mayors and city officials to think globally, but act locally during the current COVID-19 pandemic,” said Alan S. Knitowski, President, CEO and Co-Founder of Phunware. “During such trying times, we believe it is critical for local communities to take swift and decisive action from the bottom up to supplement government efforts being led from the top down at both the federal and state level, including a cogent go-forward plan for addressing the needs of citizens and visitors to each city nationwide in safely getting back to a more normal cadence for their personal and professional lives.”

Learn more about the Smart City Pandemic Response Solution

Phunware’s CEO Interviewed for Association for Corporate Growth Virtual Luncheon (April 8, 2020)

Today Phunware’s President, CEO and Co-Founder, Alan S. Knitowski, was scheduled to present at the Association for Corporate Growth (ACG) luncheon. However, with current restrictions surrounding the COVID-19 pandemic, the luncheon and all live events for the foreseeable future have been cancelled. In an effort to continue to provide members with quality content as we all navigate our new normal, ACG Austin/San Antonio conducted an interview with Mr. Knitowski in a special episode of Thom Singer’s podcast.

Listen to the full interview

Phunware Announces 2019 Earnings and Business Developments (March 30, 2020)

This week Phunware announced its 2019 financial results and provided an update on recent business developments.

“Today we are pleased to share our trailing financial results for the Company, which included a dramatic year-over-year revenue transformation from one-time, non-recurring application transactions revenue to annual and multi-year recurring platform subscriptions and services revenue tied to the licensing and use of our Multiscreen as a Service (MaaS) enterprise cloud platform for mobile,” said Alan S. Knitowski, President, CEO and Co-Founder of Phunware. “More importantly, and specific to the subsequent events and recent operational actions taken to address our go-forward business activities while the ongoing COVID-19 coronavirus pandemic continues to unfold worldwide, we have announced a $3 million structured debt financing to address our balance sheet and a furlough of 37 Phunware employees to address our cost structure during the existing governmental stay-in-place orders unique to our business facilities and operations in Central Texas, Southern California and Southern Florida.”

Read the full article from Proactive

Blythe Masters Appointed as Phunware Board of Directors Chair (March 30, 2020)

Today Phunware is pleased to announce the appointment of Blythe Masters as the new Chair of the Board. Ms. Masters succeeds Eric Manlunas, who will remain with Phunware as a Director and Member of both the Compensation Committee and Audit Committee.

“We are living in unprecedented times as the world faces the COVID-19 pandemic, so we are honored and fortunate to have Blythe serve as Chair for Phunware’s Board of Directors,” said Alan S. Knitowski, President, CEO and Co-Founder of Phunware. “Blythe’s proven leadership and experience will be invaluable to helping Phunware navigate the current macro and health environments as we continue to diligently manage cash and drive towards self-sufficiency through operational excellence.”

Read the full article from Proactive

Phunware Announces Launch of its National Ventilator Registry (March 27, 2020)

Today Phunware announced that it has launched a National Ventilator Registry, calling medical professionals to help compile the registry so clinicians have complete visibility into existing resources and can locate lifesaving equipment.

“We have built a data engine that is capable of managing over a billion active devices and four billion daily transactions, while generating more than 5 terabytes of data each day,” said Randall Crowder, COO of Phunware. “We can leverage our technology to identify and track critical medical assets like ventilators, but we need to act now and we need everyone’s help getting the word out to medical professionals on the frontline so that we can collect the information that we desperately need.”

Read the full article from Proactive

Visit the National Ventilator Registry

Phunware Announces Issuance of Senior Convertible Notes (March 23, 2020)

Today Phunware announced that it has entered into a financing transaction with Canaccord Genuity for the issuance of senior convertible notes. Upon closing of the sale, Phunware is expected to receive gross cash proceeds of $2.760 million.

Read the full press release

Phunware Recognized as AVIA Vetted Product (March 19, 2020)

Today we announced that AVIA has recognized the Phunware digital front door as an AVIA Vetted Product. These products have been proven to effectively address mobile application needs based on the criteria of AVIA Members.

“Phunware is honored to have an AVIA Vetted Product, which will allow us to connect with over 25 distinguished health systems who are committed to digital transformation in a mobile-first world,” said Randall Crowder, COO of Phunware. “We look forward to this partnership with AVIA as we continue to offer health systems an enterprise-wide, best-in-class digital front door.”

Read the full article from Proactive

Phunware Location Based Services Deployed at A Leading US Health System (March 16, 2020)

We recently announced that our patented Location Based Services, a key component of the award-winning Multiscreen-as-a-Service (MaaS) platform, has been deployed at a leading US health system spanning 30 facilities and more than 22 million square feet.

“The enterprise rollout of this mobile application enabled by our location-based services is another great example of leadership in healthcare innovation and we’re proud to play our part in building a true digital front door,” said Alan S. Knitowski, President, CEO and Co-Founder of Phunware. “Being able to navigate a complex facility easily makes hospital visits less stressful for patients, while being able to reach and inform patients with the push of a button saves precious time and increases staff efficiencies.”

Read the full article from Proactive

Phunware to Launch Investor Relations Program with Hayden IR (March 10, 2020)

Phunware announced today it has engaged Hayden IR, a highly recognized, national investor relations firm, to raise its visibility and strengthen its relationships with the investment community.

“Over the past year, we have strengthened our financial position as we approach operating cash flow breakeven and move towards breakeven on an adjusted EBITDA basis,” said Alan S. Knitowski, President, CEO and Co-Founder of Phunware. “To ensure we capitalize on these important milestones, we look forward to working with the team of professionals at Hayden IR to help us target and expand our investor audience and ensure we are communicating effectively with Wall Street.”

Read the full article from Proactive

Phunware Talks New Customer Wins for Application Transactions (March 6, 2020)

Phunware announced new customer wins for application transactions using Phunware’s proprietary Audience Engagement solution, which is a managed service capability that enables brands to build custom audiences and deliver targeted media to optimize engagement. 

The Company also recently released new user activity audience capabilities for its Multiscreen-as-a-Service (MaaS) platform that allow brands to create custom user segments, calculate approximate audience sizes and create cross-platform campaigns among users.

“Phunware has been delivering everything you need to succeed on mobile for over a decade, so helping brands engage audiences with digital media is a natural core competency for us in a mobile-first world,” said Luan Dang, CTO and Co-Founder of Phunware. “Our data-enriched media allows brands to optimize their marketing spend, while our blockchain-enabled data exchange provides improved transparency to combat ad fraud and ensure both brand and consumer protection alike.”

New customer wins included Samsung, Live Nation, Ticketmaster, House of Blues, AEG, Madison Square Garden, Metrolink, Coast Electric, Census 2020, the University of Pennsylvania and Truthfinder amongst others.

Read the full article from Proactive

Phunware Adds Top US Cancer Center as Mobile Digital Front Door Customer (March 2, 2020)

Phunware has announced that it has added one of the top rated cancer hospitals in the United States as a new customer for its mobile digital front door solution. Phunware’s Multiscreen-as-a-Service (MaaS) platform helps patients and clinicians demystify the healthcare journey for both families and staff. 

“MaaS provides our customers with a true digital front door for their patients and staffs, either end-to-end as a complete turn-key solution off-the-shelf, or as software components and tools that they can license, incorporate and build on their own through convenient and frictionless Github downloads and a comprehensive learning management system known as the Phunware Phenom Certified Developer Program,” said Alan S. Knitowski, President, CEO and Co-Founder of Phunware. “Missed appointments cost the US healthcare system more than $150 billion every year, so we’re extremely excited to enable such a prominent, globally recognized healthcare organization to better manage their patient and clinician experience across more than 14 million square feet of facilities spread over a 40 block downtown metropolitan area.”

Read the full article from Proactive

Phunware’s Location Based Services to be Featured in Cisco Meraki Marketplace (February 25, 2020)

Phunware has announced that Cisco Meraki now features the Company’s Multiscreen-as-a-Service (MaaS) Location Based Services (LBS) app in its Meraki Marketplace, which is an exclusive catalog of Technology Partners like Phunware that showcases applications developed on top of the Meraki platform, allowing customers and partners to view, demo and deploy commercial solutions.

“We recently announced a collaboration debut between Phunware and Cisco Webex called the On My Way mobile app portfolio for South by Southwest (SXSW) attendees in March in conjunction with the Cisco Innovation Hub at Capital Factory, where I’ll be discussing three-dimensional cognitive workspaces,” said Randall Crowder, COO of Phunware. “The Meraki Marketplace will now provide Phunware an important channel to thousands of Cisco Meraki customers across more than 100 countries worldwide who need the very best LBS solutions for their network environments without the risk of deploying unproven technology.”
Read the full article from Proactive

SwiftUI: A Game Changer (July 17, 2019)

Last month at WWDC 2019, Apple released a heap of information to continue building on their software platforms. This year’s event was jam-packed with new features such as user profiles on tvOS, a standalone App Store on watchOS and dark mode on iOS. Also announced were the stunning Mac Pro and Pro Display; the new Mac Pro is a powerhouse of a machine that can tackle extreme processing tasks.

Apple has a recurring theme of releasing mind-blowing features, but nothing was more exciting than the announcement of SwiftUI. As Apple’s VP of Software Engineering, Craig Federighi, announced the new UI toolkit, it felt like a metaphorical bomb dropping in the middle of the room!

Shortly after a quick SwiftUI overview, the keynote was over. Developers were left excited, stunned and filled with hundreds of questions about the new UI framework. It felt like the only thing missing from the SwiftUI announcement was the iconic “One More Thing” introduction slide Steve Jobs was known for using.

This blog explains what SwiftUI is, the benefits of using SwiftUI compared to the current UI programming method, and how SwiftUI handles data management.

SwiftUI and Declarative Programming

Let’s take a step back and look at what makes this UI toolkit exciting. SwiftUI lets developers build the designs for their apps in a new declarative way. Native iOS developers have only known how to build and maintain their UI through imperative programming. Imperative programming requires the developer to maintain every UI state themselves and update each item to keep it in sync with their data models. As your UI elements increase, so does the complexity of your state-management logic, leading to state problems.

With declarative programming, the developer sets the rules that each view should follow and the framework makes sure those guidelines are enforced. As the user interacts with your UI and your data model changes, the view rebuilds itself to reflect those changes automatically. This vastly reduces code complexity and allows developers to create robust user interfaces with fewer lines of code. Other development frameworks, such as React Native and Flutter, have already been using this declarative UI paradigm, and developers love how quickly they can put together the UI and how it produces easy-to-read code.

But the declarative framework is only part of the story. SwiftUI brings even more enhancements to iOS programming, such as live previews in Xcode, drag and drop programming and cross-platform development.

Overview of SwiftUI

In order to display the simplicity and beauty of SwiftUI, I think it’s worth seeing a small sample of code. Let’s think about a single view app that contains a table view. This is a view that iOS developers have programmed countless times. You immediately think of adding a UITableView through Interface Builder or programmatically, then assigning its data source and delegate to your ViewController. You then need to implement the required data source and delegate functions to fill the content of the table view. Before you know it, this simple table view is up to 30 lines of code.

Here’s the Swift code for a basic table view that displays a list of country names:

class MasterViewController: UITableViewController {
    var countries: [Country] = fullCountryList
 
    override func viewDidLoad() {
        super.viewDidLoad()
    }
 
    // MARK: - Table View
    override func numberOfSections(in tableView: UITableView) -> Int {
        return 1
    }
 
    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return countries.count
    }
 
    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
 
        let country = countries[indexPath.row]
        cell.textLabel?.text = country.name
        return cell
    }
}

Now we can take a look at the code needed to create that same table in SwiftUI:

struct MyTableView : View {
    @State var countries: [Country] = fullCountryList
 
    var body: some View {
        List(countries) { country in
            Text(country.name)
        }
    }
}

Believe it or not, the part of that code that actually displays the table view is the 3 lines of code inside the body computed variable, and that includes the closing bracket. The List struct infers the row count and adjusts its cells to display the text. (For this to work, the Country type needs to conform to Identifiable so that List can tell its rows apart.)

You’ll notice that MyTableView is of type View. In SwiftUI, a View is a struct that conforms to the View protocol, rather than a class that inherits from a base class like UIView. This protocol requires you to implement the body computed variable, which simply expects a View to be returned. Views are lightweight values that describe how you want your UI to look and SwiftUI handles actually displaying UI on the screen.

Using Xcode 11 and SwiftUI, you now have the canvas on the right panel which shows you a live preview of your code. This preview is created by the PreviewProvider block of code that is automatically added with each new View you create. The beauty of this preview is that it refreshes itself as you make changes to your code without having to build and run with each change.

This will surely decrease development time as you no longer have to compile your entire project to check your minor UI adjustments while working to make your app design pixel perfect to the design specs.
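
For reference, here’s a minimal sketch of the PreviewProvider block Xcode generates, assuming the MyTableView struct from earlier:

struct MyTableView_Previews : PreviewProvider {
    static var previews: some View {
        MyTableView()
    }
}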

Data Management with SwiftUI

This only scratches the surface of what SwiftUI brings to iOS development. SwiftUI is easy to use but there are advanced features that allow you to take your app to the next level. Developers will want to dive deeper into how data is managed within SwiftUI. To keep your data and UI in sync, you will need to decide which views will maintain the “source of truth” for your app and which views will simply be passed as reference data.

Let’s imagine we’re developing a media player and working on the Player screen. This will have many UI elements, but we’ll simplify it to the play/pause button and a progress view.

Here’s a rough model: the PlayerView contains smaller SwiftUI views for the PlayButton and ProgressView. Each SwiftUI view needs the isPlaying attribute to know how to update its own UI state, but if each view maintains its own value, this could cause state problems.

Instead, we want there to be a “master” isPlaying attribute that all the SwiftUI views can read and react to. Here’s a better model: the parent PlayerView holds the master isPlaying attribute and the child views only reference this variable. When the user interacts with the child UI elements to manipulate the isPlaying boolean, those changes make their way through the views that are associated with the variable.

Let’s take a look at what this looks like in our code:

struct PlayerView : View {
    let episode: Episode
    @State private var isPlaying: Bool = false
 
    var body: some View {
        VStack {
            Text(episode.title).foregroundColor(isPlaying ? .white : .gray)
 
            PlayButton()
        }
    }
}

This SwiftUI PlayerView is a vertical stack (VStack) that has a Text label with the episode title and a PlayButton view.

Swift 5.1 will introduce Property Wrappers, which allow SwiftUI to use the @State and @Binding keywords to add additional logic to your view’s variables. In the code above, the PlayerView is the owner of the isPlaying attribute, so we indicate this with the @State keyword.

struct PlayButton : View {
    @Binding var isPlaying: Bool
 
    var body: some View {
        Button(action: {
            self.isPlaying.toggle()
        }) {
            Image(systemName: isPlaying ? "pause.circle" : "play.circle")
        }
    }
}

Now looking at the PlayButton code, we have the isPlaying boolean here as well, but we added the @Binding keyword to tell this View that this variable is bound to a @State attribute from a parent view.

When a parent view creates a child view, it can pass the State variable into the child’s Binding variable as a parameter, using the “$” prefix:

struct PlayerView : View {
    let episode: Episode
    @State private var isPlaying: Bool = false
 
    var body: some View {
        VStack {
            Text(episode.title).foregroundColor(isPlaying ? .white : .gray)
 
            PlayButton(isPlaying: $isPlaying)
        }
    }
}

By doing this, when a binding variable is changed by some user interaction, the child view sends that change through the entire view hierarchy up to the state variable so that each view rebuilds itself to reflect this data change. This ensures that all your views maintain the same source of truth with your data models without you having to manage each view manually.

This is a high level introduction to data management with SwiftUI. I encourage you to dig further into this topic by watching the WWDC tech talk, Data Flow Through SwiftUI.

Start Working with SwiftUI

The best way to grow your knowledge of SwiftUI and learn its more advanced functions is to start using it to build an app. The great news is that you don’t have to build an entire app from scratch in order to use SwiftUI. Apple provided classes and protocols that allow you to integrate newly designed SwiftUI views into your existing projects.
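
A minimal sketch of one such integration point: wrapping a SwiftUI view in a UIHostingController so it can be pushed from existing UIKit navigation (assuming the MyTableView struct from earlier):

import SwiftUI
import UIKit

class ExistingViewController: UIViewController {
    // Present a SwiftUI view from UIKit code.
    func showSwiftUITable() {
        let hosting = UIHostingController(rootView: MyTableView())
        navigationController?.pushViewController(hosting, animated: true)
    }
}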

So for the next feature you work on in your iOS, watchOS or tvOS project, consider developing one of the views in SwiftUI and integrating it into your project.

If you want to keep digging into SwiftUI, check out Apple’s WWDC tech talks and tutorials.

Here at Phunware, our architects and developers stay up-to-date with the latest changes from Apple WWDC and Google I/O. If you’re interested in joining the Phamily, check out our latest job openings. We’re currently looking for Android and iOS software engineers!

The Power of Machine Learning on a User Device (July 2, 2019)

Until recently, using machine learning inside your products was no small task. It required a data center with servers running all the time: dedicated space, memory and bandwidth. Now we can run machine learning directly on a user’s device to build new, empowering features.

Today, we’re showing you how easy it can be to run your own machine learning on a user device. In our step-by-step tutorial, we’re going to go from getting your data, to training your model on a Mac, to running an iOS app with your newfound powers. Read on for instructions!

Rise of Accessibility for Machine Learning

New tools are making machine learning opportunities more and more accessible. Apple has CoreML, a powerful framework optimized for Apple hardware. And Google has TensorFlow Lite models that are made to fit on phones. Both Apple and Google, at their respective annual conferences, dedicated a significant amount of time talking about how they’ve benefitted from moving machine learning to users’ devices, and how they’re empowering developers on their platforms to do the same. With machine learning on your device, you could add these features through your app:

  • Voice control
  • Facial recognition through an app
  • Offline chatbots to assist with FAQs or onboarding
  • Decipher text from signs for accessibility
  • Scan and store text from business cards or important documents
  • Translate text
  • Recognize objects like cars and identify their make/model/type
  • Convenient typing predictions
  • Keyboards that autocomplete your writing in the style of a famous author
  • Add never-before-seen filters to images
  • Tag photos and videos according to who or what is in them
  • Organize emails and messages by what is most important to you

Advantages of On-Device Machine Learning

  1. It’s scalable. As the number of users of your app grows, you don’t have to worry about more traffic with the server, or Internet connection points of failure. You don’t need to get extra memory and storage. And users avoid bandwidth issues because they don’t have to ping the Internet all the time.
  2. It’s fast. You’re not hindered by internet latency because you are using hardware that is optimized for machine learning.
  3. It’s private. Your users can rest assured knowing the information being analyzed is all private. You are not handling their data; everything is happening on their devices at their behest.

That said, there are still costs associated with machine learning. For example, creating the models that will be used on device still requires massive amounts of quality data and high-powered machines. Yet even these resources are becoming more readily available and easy to use.

Interested in seeing just how easy it can be? Follow our tutorial below!

Before Getting Started.

  • It will be helpful to know a tiny bit of iOS development, including how to run an app on the simulator through Xcode.
  • Also, familiarity with Swift Playgrounds is helpful but not required.
  • Other than that, we’ll take you through the machine learning process one step at a time.

You can find the full code you’ll be writing at the end of this blog post.

Step 1: Getting the Data.

This tutorial focuses on a kind of machine learning called natural language processing (NLP) – which essentially means, “making sense of words.” Specifically, we will be doing a sentiment analysis. This is where we take a word or phrase and decide what feeling is associated with it. Great use cases for this functionality include marketing analysis of customer feedback, evaluating tester interviews for product design, or getting the lay of the land with comments left on user reviews of a product.

Let’s say you want to use sentiment analysis to organize or display messages in your new messaging app or email client. You can group messages by tone, or color-code them to give the user a heads-up of what’s coming, or help them decide what they should answer right away, or whatever else you can imagine as a helpful feature. (And again, we can do all this by offloading the processing power and smarts to the user’s device without compromising other features users want, like end-to-end encryption.)

First though, you’ll need to get the data. Ours will come as a CSV. Most major spreadsheet programs can open a CSV, so you can easily see what the data looks like.

DOWNLOAD SAMPLE CSV

As with any data, we want to be transparent with where we got our information. I’ve cleaned up the linked dataset, but the basics of it come courtesy of work done for this paper:

Maas, A., Daly, R., Pham, P., Huang, D., Ng, A. and Potts, C. (2011). Learning Word Vectors for Sentiment Analysis: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. [online] Portland, Oregon, USA: Association for Computational Linguistics, pp.142–150. Available at: http://www.aclweb.org/anthology/P11-1015.

This dataset is basically the CSV form of a simple spreadsheet with two columns.

  • One is labeled “sentiment” and is a column with values of either “Positive” or “Negative”. You may see this in other data sets as 1 for positive and 0 for negative, but for coding purposes we need to format as words instead of integers.
  • The other column is the text of the review, and it is labeled “review” at the top. And there are 25,000 reviews! Go ahead and import this into a spreadsheet to see what it looks like.

This type of machine learning is known as classification and we’ll be making a classifier. The reviews are your “x” inputs, or features, and the “Negative”/“Positive” values – known as labels – are like the “y” values you get as output. Your target prediction is a “Negative” or “Positive” value.

Alright. So if you have downloaded the data, now it’s time to write some code to train the model.

Step 2: Training the Model

Training a model means giving our program a lot of data so that it learns what patterns to look for and how to respond. Once the model is trained, it can be exported as a file to run on a device. That means you’re not taking all those gigabytes of training data with you.

It’s sort of like pouring lots of water over a material to make a sculpture that has the right shape. Our training data is the water. The sculpture is the model. It’s what we’ll use once it is trained and in the right shape.

For this example, we’ll use an Xcode Playground, which is like a blank canvas that runs code and is very useful for experimenting.

  1. Open up Xcode, preferably Xcode 10.2 or later. Your version of iOS should be at least iOS 11. In Xcode go to File > New > Playground. Use macOS as the template, and choose “Blank” from the options. Then click “Next.”
  2. Now it will ask you where to save the project and what to call it. I called mine “CreateMLTextClassifier”.
  3. Save your Playground. It will open up with some boilerplate code. Delete all of that code.

The full code for the playground is available at the end, but we’ll also take you step-by-step.

First we’ll import the frameworks we’ll need at the very top. Add this:

import CreateML
import Foundation
import PlaygroundSupport

Then we’ll create a function that will do the actual magic. Below your import statements, write:

func createSentimentTextClassifier() {
 
}

Now we’ll fill out this function. Write everything in between the brackets until told otherwise. The first thing you’ll write inside the brackets are:

// Load the data from your CSV file
let fileUrl = playgroundSharedDataDirectory.appendingPathComponent("MovieReviewTrainingDatabase.csv")

So we have this line, but in order to make it actually work, we’ll need to set up a folder with our CSV in the right location. What’s happening here is that the Playground is looking for a folder called “Shared Playground Data”. So go ahead and make a folder with that name in your “Documents” directory, and then add the “MovieReviewTrainingDatabase.csv” to that folder. Now the Playground can find it!

Back to coding. Below the fileUrl lines you just wrote, add:

guard let data = try? MLDataTable(contentsOf: fileUrl) else {
    return
}

This takes the CSV file and converts it to a table format that the program knows how to handle better for machine learning.

Next, below the “guard let data …” lines you wrote, write:

// Split the data for training and testing
let (trainingData, testingData) = data.randomSplit(by: 0.8, seed: 5)

This will give you data for training and testing: 80 percent of what’s in the CSV goes to training (that’s what the 0.8 means), and the other 20 percent is held back for later. The model goes over the training data again and again; the testing data, which the classifier has never seen, then tells us how well the model would do in the real world.

As a side note, it’s possible to train your machine learning model so many times on the same data that you “overfit” your model. This means it’s great at working with the training data, but it may not be great at generalizing outside that data. Imagine a facial recognition system that easily identifies my face, but when shown a new face it cannot recognize that it is even a face because it had only ever seen my face. Sort of like that.

Now, below the “trainingData, testingData” lines you wrote, write:

// Make the model
guard let sentimentClassifier = try? MLTextClassifier(trainingData: trainingData, textColumn: "review", labelColumn: "sentiment") else {
    return
}

This creates the classifier and trains it on the trainingData we made earlier. Create ML already has something called an MLTextClassifier, which is specifically meant for this kind of use. So we tell it that the column of our spreadsheet/CSV with our text is the column with “review” written at the top, and that the “labelColumn” (the labels we’re trying to predict) is the “sentiment” column of our spreadsheet/CSV.

Now below the previous lines write:

// Training accuracy percentage
let trainingAccuracy = (1.0 - sentimentClassifier.trainingMetrics.classificationError) * 100
print("Training accuracy: \(trainingAccuracy)")

This will let us know how accurate our model got during training. It should start low, around 50 percent (no better than guessing), and then climb into the high 90s.

Now below the previous lines write:

// Validation accuracy percentage
let validationAccuracy = (1.0 - sentimentClassifier.validationMetrics.classificationError) * 100
print("Validation accuracy: \(validationAccuracy)")

This tells us how our validation is going. We already divided the data between training and testing; within the training portion, Create ML holds out another slice for validation. The model trains on most of the training data, then checks itself against that held-out validation slice between training cycles. It’s another standard step that helps catch overfitting and similar problems.
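
Create ML carves that validation slice out automatically. If you ever wanted explicit control over it, you could hold out a slice yourself with another randomSplit before training. This is just a sketch of the idea, not something this tutorial needs:

// Optional: carve an explicit validation set out of the training data
let (coreTrainingData, validationData) = trainingData.randomSplit(by: 0.9, seed: 5)
// Train on coreTrainingData instead, then score the held-out slice with
// sentimentClassifier.evaluation(on: validationData)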

Now below the previous lines write:

// Testing accuracy percentage
let evaluationMetrics = sentimentClassifier.evaluation(on: testingData)
let evaluationAccuracy = (1.0 - evaluationMetrics.classificationError) * 100
print("Evaluation accuracy: \(evaluationAccuracy)")

This finally tells us how accurate the model is on the testing data after all of our training. It’s the closest we get to a real-world scenario.
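
If you want more detail than a single percentage, MLClassifierMetrics carries per-class information as well. As far as I know it exposes a confusion table and per-label precision/recall, so an aside like this should work (again, not part of the tutorial file):

// Optional: inspect which labels get confused with which
print(evaluationMetrics.confusion)       // counts of predicted vs. true labels
print(evaluationMetrics.precisionRecall) // per-label precision and recall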

Now below the previous lines write:

// Add metadata
let metadata = MLModelMetadata(author: "Matthew Waller", shortDescription: "A model trained to classify the sentiment of messages", version: "1.0")

This is just metadata saying who made the model, a description, and the version.

And the last part of the function, below the previous lines, is:

// Export for use in Core ML
let exportFileUrl = playgroundSharedDataDirectory.appendingPathComponent("MessageSentimentModel.mlmodel")
try? sentimentClassifier.write(to: exportFileUrl, metadata: metadata)

This exports the model so we can drop it in for use in our app.

Now that you’ve made your function you’re ready to run it!

Below the brackets of the function write:

createSentimentTextClassifier()

Now run the Playground! It may automatically run, or you can press the play icon in the lower left corner.

You should see things like the training, validation, and evaluation accuracy pop up in the console. After everything was parsed and analyzed, my training took 8 seconds. My training accuracy was 100.0, and validation and test data evaluation were at around 88 and 89 percent, respectively.

Not a bad result! Even this tutorial on deep learning, a subset of machine learning, using a modest LSTM (“Long Short-Term Memory”) neural net got about 87 percent accuracy on the test data.

With less than 50 lines of code and about 8 seconds of training, we’ve analyzed 25,000 movie reviews and exported a machine learning model for use. Pretty awesome.

Step 3: Putting Machine Learning to Work

It’s time to get the app ready to use our new model.

I’ve made a skeletal app where we can enter some text, and then automatically evaluate it as positive or negative. With that basic feature up and running, you can imagine entering text from any source, knowing how to classify it, and then presenting it in the right way for the convenience of your user. (And in the future, if you have the labeled data, you could do things like determine whether something is or is not important, or divide text into more categories other than just “Positive” or “Negative”.) The project is available on GitHub.

VIEW GITHUB PROJECT

Once you’ve cloned or downloaded the project, open it in Xcode. Then open a Finder window showing the Shared Playground Data folder you created, and drag and drop the “MessageSentimentModel.mlmodel” file you created through the Playground into the Xcode project, just below the ViewController.swift file.

When it asks you how you want to import it, check all the checkboxes and choose “Create Groups” from the radio button options.

Now you’re ready to add the code to make the model work.

Go to the ViewController.swift file, and below “sentimentLabel” add:

let sentimentModel = MessageSentimentModel()

Next, uncomment the code in “checkImportanceTapped(_ sender: UIButton)”.

The first piece is this guard statement:

guard let languageModel = try? NLModel(mlModel: sentimentModel.model) else {
    return
}

This wraps our model in NLModel, from the Natural Language framework, an even easier-to-use interface, so that we can take the user’s input and update the text of the sentimentLabel in one line, like so:

sentimentLabel.text = languageModel.predictedLabel(for: text)

And it’s as simple as that!
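
To get a feel for it, you could also run a handful of strings through the same model. Here is a small sketch (the sample messages are made up):

// Classify a batch of sample messages with the same NLModel
let samples = ["What a fantastic surprise!", "This is the worst.", "I guess it was fine."]
for message in samples {
    let sentiment = languageModel.predictedLabel(for: message) ?? "Unknown"
    print("\(message) -> \(sentiment)")
}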

Now let’s run it.

If we type in “I’m doing well”, we get the label “Positive” at the bottom. So far so good!

And “I had a really bad day” is …

And now, we’re off to the races! Play around with it yourself!

I hope you’ve enjoyed this demonstration and primer on machine learning, and can imagine the potential of running AI on device. At Phunware, we’re always working for better quality code. That means figuring out how to apply the latest technologies (such as data binding) to challenging, often high-profile projects. In fact, Phunware’s Knowledge Graph uses machine learning and proprietary algorithms to curate over five terabytes of data every day from approximately one billion active devices each month. This data is then used to provide intelligence for brands, marketers and media buyers to better understand their customers, engage and acquire new customers, and create compelling user experiences.

Feel free to reach out with any questions about the myriad possibilities around mobile (or any sized screen) in this field or others. Thank you for reading!

Interested in joining the Phamily? Check out our latest job openings. We’re currently looking for Android and iOS software engineers!

Full Playground code:

import CreateML
import Foundation
import PlaygroundSupport 
 
func createSentimentTextClassifier() {
    // Load the data from your CSV file
    let fileUrl = playgroundSharedDataDirectory.appendingPathComponent("MovieReviewTrainingDatabase.csv")

    guard let data = try? MLDataTable(contentsOf: fileUrl) else {
        return
    }

    // Split the data for training and testing
    let (trainingData, testingData) = data.randomSplit(by: 0.8, seed: 5)

    // Make the model
    guard let sentimentClassifier = try? MLTextClassifier(trainingData: trainingData, textColumn: "review", labelColumn: "sentiment") else {
        return
    }

    // Training accuracy percentage
    let trainingAccuracy = (1.0 - sentimentClassifier.trainingMetrics.classificationError) * 100
    print("Training accuracy: \(trainingAccuracy)")

    // Validation accuracy percentage
    let validationAccuracy = (1.0 - sentimentClassifier.validationMetrics.classificationError) * 100
    print("Validation accuracy: \(validationAccuracy)")

    // Testing accuracy percentage
    let evaluationMetrics = sentimentClassifier.evaluation(on: testingData)
    let evaluationAccuracy = (1.0 - evaluationMetrics.classificationError) * 100
    print("Evaluation accuracy: \(evaluationAccuracy)")

    // Add metadata
    let metadata = MLModelMetadata(author: "Matthew Waller", shortDescription: "A model trained to classify the sentiment of messages", version: "1.0")

    // Export for use in Core ML
    let exportFileUrl = playgroundSharedDataDirectory.appendingPathComponent("MessageSentimentModel.mlmodel")
    try? sentimentClassifier.write(to: exportFileUrl, metadata: metadata)
}
 
createSentimentTextClassifier()

The post The Power of Machine Learning on a User Device appeared first on Phunware.

App Development: Should You Do It In-House or Outsource It? http://100.21.88.205/app-development-should-you-do-it-in-house-or-outsource-it/ Wed, 05 Aug 2015 14:25:38 +0000

It’s December 24th. I’m staring at this year’s hottest gift, the Barbie Dreamhouse™. There are hundreds of pieces spread across the living room and instructions written in eight languages, none of which make any sense to me.

Time is not on my side either. My four-year-old will walk down the stairs in a few hours and I don’t think she’ll understand the concept of “assembly required” from Santa. So I dig in and prepare for a long night.

After watching several YouTube instructional videos and enlisting some family members (who worked for beer), we finish the project. It was stressful, time-consuming and certainly outside of my skillset. Seeing my daughter’s eyes light up in the morning was rewarding, but I’m not sure her reaction would have been any different if there had been an outsourcing option to build the pink monstrosity. She might have liked it better, in fact—we couldn’t get the elevator to work.

The in-house vs. outsource dilemma plagues businesses too, particularly when it comes to mobile development. Many companies are scrambling to generate mobile apps in-house, believing that it’s cheaper, easier, faster, more controllable and more efficient. If your organization is weighing this decision, consider the following.

Challenges of In-House App Development

Three main challenges arise when businesses attempt to keep all mobile app development in-house:

1. The skillset struggle is real.

Even with up to 15 developers working on their mobile app initiatives, 94 percent of organizations don’t have the necessary mobile development staff to tackle all of their needs. Almost half of software solutions architects and senior software developers say there’s a gap in the skills required for mobile development.

Android and iOS development require different and fairly complicated coding languages: Java for Android and Objective-C or Swift for iOS. The average Android developer can’t just switch over to coding for iOS without additional training or study. Creating apps for both platforms effectively means two development efforts and two skillsets.

2. It’s expensive and time-consuming.

A bare-bones internal mobile development team might consist of a mobile designer, one or two developers, a project manager and a quality assurance (QA) engineer. Even if you already have some of these folks on staff, you likely need to hire at least one person. It can take weeks to get the HR process rolling and find the right person, and even more time to get them fully on board (average of 3-6 months).

Recruitment and hiring don’t just take time. They take money. Consider the cost of advertising job listings, hiring recruiters, performing background checks and covering relocation expenses—not to mention the developer’s six-figure salary and the cost of technology, licensing fees, software certificates and more.

3. Developing mobile apps in-house can be risky.

If you decide to keep all of your mobile app development in-house, how can you be sure your team’s skills are top-notch? Are you savvy enough to differentiate between a decent coder and a mobile expert? Most people aren’t.

Scalability can also become an issue with an in-house team. What if your project scope expands? As we’ve already established, it’s not so easy to just plug in an additional coder. Accountability can also present challenges. Without specific mobility expertise, decision-makers may struggle to identify the nature and root causes of any problems that arise, leaving the project stalled out without a plan for moving forward.

Mobile is everywhere. Stay up to date on the latest mobile news, trends and content with our monthly newsletter!

SUBSCRIBE TO THE NEWSLETTER

Advantages of Outsourcing App Development

Outsourcing your mobile app development to a firm that specializes in mobile can be a very strategic decision—one that saves you time, hassle, and money while yielding a better-quality product. Here are a few advantages of letting someone else handle your mobile app development:

  1. Fixed costs for a specific scope and delivery.
  2. Less lag time: An outside team can usually start immediately.
  3. Synergy: An established team will have a solid working relationship with each other and with the required technologies.
  4. Accountability: A good mobile firm will give you a solid contract and scope of work, with clearly defined responsibilities and terms. If a mistake or delay occurs, you will have a dedicated account rep to address the problem. There’s a lot less to worry about.
  5. Access to plug-and-play features and modules: Many app features and modules are relatively standard. It’s how you use them that makes the app unique and special. An experienced app development team will have an existing library of these standard products already tested and optimized. There’s no need to build every feature from scratch when you can simply customize a proven solution. This saves time and money, while ensuring performance.
  6. Greater experience and expertise: Because of their focus on mobility, an outsourced team will be on top of the latest trends and technologies. They can share best practices gained from extensive experience and ensure that your app is in line with your vision and your target audience. A dedicated mobile expert can remove the guesswork and put your company and its app in the best possible situation to succeed.
  7. Options: You can outsource part or all of your app development. You can split the work, outsourcing iOS development while keeping Android in-house (or vice versa). You can use outsourced staff augmentation to fill gaps in your in-house development strategy. Or you can outsource the app discovery process, letting third-party pros develop your roadmap.

Ultimately, this decision comes down to cost and risk. Businesses are under intense pressure to maintain a competitive presence in the mobile space, and it’s natural to want to keep mobile development in-house. But for most, it just doesn’t make the best business sense. And your proverbial Barbie Dreamhouse might end up with a non-working elevator.

Ready to put mobile first and win? Download Mobile First: Harnessing the App Lifecycle for Transformative Business Success for a strategic and tactical model with actionable items for every stage of the process.

DOWNLOAD THE eBOOK

The post App Development: Should You Do It In-House or Outsource It? appeared first on Phunware.

API, SDK—WTF? Understanding the Mobile App Alphabet Soup http://100.21.88.205/api-sdk-wtf-understanding-mobile-app-alphabet-soup/ Thu, 04 Dec 2014 04:11:44 +0000

Photo by Kyle Mills Hall

When it comes to digital and mobile technology, an awful lot of acronyms get tossed around. Everybody just nods and smiles, but many of us don’t really know what all that jargon really means…and nobody wants to raise their hand and say, “Hey, what IS an API, anyway? What about an SDK?”

It’s understandable. If you don’t work with them every day, you don’t have much reason to know a lot about SDKs or APIs. We’re going to break them down for you in plain English, so at your next cocktail party or digital marketing meeting, you can throw around those acronyms like a boss (or at least a knowledgeable digital player).

What’s an API?

“API” stands for “application programming interface.” It’s an interface that specifies the way two applications or systems can interact with each other. It lays the ground rules for the conversation.

We all deal with APIs every day. For example, let’s say you’re hungry for pizza. When you open Yelp and search for “pizza,” the app sends a request to the back-end database using the appropriate API. That request might look something like this: getNearbyPOIForLatLong()

The Yelp API requires certain parameters in your request so that the back-end system knows which data to return—latitude and longitude for your location, the search term (in this case, “pizza”), and a search radius (how far to search around your location). Now the Yelp database knows what information to retrieve, the app can display that info on a map or as a list, and you can get your pineapple anchovy pizza with extra olives ASAP.
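
To make that concrete, here is roughly what such a request could look like from the app side in Swift. The endpoint and parameter names below are invented for illustration; this is not Yelp’s actual API:

import Foundation

// Hypothetical points-of-interest endpoint, for illustration only
var components = URLComponents(string: "https://api.example.com/v1/nearbyPOI")!
components.queryItems = [
    URLQueryItem(name: "latitude", value: "30.2672"),
    URLQueryItem(name: "longitude", value: "-97.7431"),
    URLQueryItem(name: "term", value: "pizza"),
    URLQueryItem(name: "radius", value: "1600") // search radius in meters
]

let task = URLSession.shared.dataTask(with: components.url!) { data, response, error in
    guard let data = data, error == nil else { return }
    // The service returns structured results (typically JSON) for the app to show on a map or list
    print(String(data: data, encoding: .utf8) ?? "")
}
task.resume()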

What’s an SDK?

“SDK” is short for “software development kit.” It refers to a set of pre-written code, documentation and programming tools that developers can use as the foundation for creating new software applications. A mobile SDK is a kit designed specifically for creating apps for mobile devices.

Let’s say your neighborhood Pizza Plaza wants to build an app that incorporates push notifications—those little messages that pop up on your smartphone’s lock screen—to tell customers when their order is ready or when they’re running a special deal on calzones.

To enable this communication, the Pizza Plaza developer(s) might use Phunware’s push notifications SDK. The SDK would contain everything the developer needs to deliver the right message to the right recipient at the right time. Using Phunware’s SDK would make adding push notifications pretty much plug-and-play, saving Pizza Plaza’s dev team a lot of time and hassle.

The value of the SDK is that Pizza Plaza does not have to build all of the services associated with the functionality (in this case, push notifications). Instead, Pizza Plaza can focus on making delicious pizzas and running its business—not on app development.
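
An SDK like that sits on top of Apple’s own plumbing. Just to show the lowest layer involved, here is what asking the user for notification permission looks like with Apple’s UserNotifications framework on modern iOS (this is the platform API, not Phunware’s SDK, whose interface we are not reproducing here):

import UserNotifications

// Ask the user for permission to show alerts, sounds and badges
UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge]) { granted, error in
    if granted {
        // Safe to register for remote notifications and hand the device token to your push provider
        print("Push permission granted")
    }
}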

Interested in learning more about these kinds of push notifications? Learn all about mobile marketing automation in our eBook: Mobile Marketing Automation: Why It Matters and How to Get Started.

DOWNLOAD THE eBOOK

What does all of this mean to you?

If you are considering building a mobile app (or having one built) for your business, SDKs and APIs are your friend. They can make the whole process much more efficient because a developer can use SDKs and APIs to add functionality without having to reinvent the wheel each time.

Here at Phunware, we can make you a custom mobile app from concept through completion. We also have pre-packaged SDKs that give your developers access to Phunware features. For example, our advertising SDK can be added to an existing app, giving it the ability to run ads from the Phunware Advertising network in part of the app’s real estate.

To sum it all up: an SDK helps you build the app. An API lets the app communicate with various web services to deliver really cool functionality. And Phunware is here to help. Get in touch to learn more!

CONTACT US

The post API, SDK—WTF? Understanding the Mobile App Alphabet Soup appeared first on Phunware.

Launch Announcement: Turner Classic Movies http://100.21.88.205/launch-announcement-turner-classic-movies/ Wed, 06 Nov 2013 09:22:17 +0000

From Turner Classic Movies and Phunware, Watch TCM is simply the most exciting and in-depth experience you’ll find about classic movies anywhere on a mobile device or computer. With over 300 titles to choose from in any month, 2 LIVE broadcast feeds of TCM, and exceptional in-depth background information, clips, and stunning galleries on every title playing on Turner Classic Movies, prepare to immerse yourself in the beauty of classic film like never before.

Whether you’re on the go with your mobile device or computer, WATCH TCM is now available when you want, where you want, and with the same great attention to detail and passion around the history and legacy of movies you expect from Turner Classic Movies.

Features:

  • Two Live Streams: an East and a West Coast feed of Turner Classic Movies. Watch 2 different movies LIVE at any time, UNCUT and COMMERCIAL FREE.
  • Hundreds of On Demand Movies! That’s right, nearly every title playing on TCM is available to watch On Demand. Includes our introductions from TCM hosts Robert Osborne and Ben Mankiewicz. UNCUT, COMMERCIAL FREE, and presented in their original aspect ratios, preserving the films the way they were meant to be seen.
  • Gorgeous, responsive design, featuring a slate black color approach to enhance viewing on portable devices.
  • Interactive Schedule: a 2-month schedule (!) to help you plan your viewing, with information accessed from TCM’s critically acclaimed movie database. Also includes listings for short films playing on TCM. Look for Movie Shorts coming to On Demand soon!
  • Movie Alerts and Actor Subscriptions: subscribe to movie stars you love and get notified when they are available in WATCH TCM. Get alerts on films in our upcoming schedule when they are available to play On Demand.
  • Watchlist: add any movie to your queue to watch later.
  • Browse by TCM Themes: quickly access On Demand films by popular themes such as The Essentials, Star of the Month, Silent Sunday Nights, and more. Sort films with additional helpful filters.
  • Fan Feed: sign in with Twitter or Facebook and leave your comments on any film playing On Demand! Add to comments from others, or just enjoy viewing the fan comments on the film yourself. It’s a whole new exciting way to experience classic movies.
  • Access 1000s of short-form clips and trailers to preview titles and see what’s playing or upcoming.
  • Stunning, exclusive, many never-before-seen image galleries from titles playing on TCM each month. Post them to Instagram!
  • Share images and other content via Twitter, Facebook, and Instagram.
  • Introducing the TCM BLOG READER: explore great blog writing from across the web featuring the latest news and great writing on classic film, personally selected by TCM staff writers.
  • In-depth information on every title playing on TCM right at your fingertips: feature-length articles, cast & crew, stunning image galleries, background information, complete synopses and more.
  • Shop.tcm.com highlights and features. Get movies playing on TCM in our store and more.
  • Search across the entire 2-month schedule for favorite stars, films, and clips. Access titles and set reminders.
  • Retro TCM Clock.

Watch TCM is free. To view LIVE broadcasts and On Demand titles, you’ll need to log in with your cable or satellite provider username and password. Please note: not all cable and satellite providers currently support Watch TCM.

iTunes
Google Play

The post Launch Announcement: Turner Classic Movies appeared first on Phunware.

The iOS 7 Survival Guide for Mobile Application Developers http://100.21.88.205/ios-7-survival-guide-mobile-application-developers/ Thu, 26 Sep 2013 10:23:13 +0000

While much of iOS 7 may seem cosmetic and surface-level, mobile application developers now have access to amazing new capabilities for enhancing user experiences and keeping people coming back for more…so we created a ten-page guide to help you capitalize on the opportunity.

Download Phunware iOS 7 Survival Guide

The post The iOS 7 Survival Guide for Mobile Application Developers appeared first on Phunware.
