Where can I find documentation or examples of ua-parser for SEO?
The Ultimate Authoritative Guide to ua-parser for SEO: Finding Documentation and Examples
By [Your Name/Publication Name], Tech Journalist
Published: October 26, 2023
Executive Summary
In the ever-evolving landscape of search engine optimization (SEO), understanding the nuances of user agents is paramount. Search engine bots, mobile devices, various browsers, and different operating systems all present unique challenges and opportunities for website visibility. The ua-parser library, a powerful and versatile tool, stands at the forefront of deciphering these complex user agent strings. This guide serves as an exhaustive resource for SEO professionals and developers seeking to harness the full potential of ua-parser. We will meticulously explore where to find comprehensive documentation, practical code examples, and delve into the core technical underpinnings of user agent parsing, its critical role in SEO, and its future implications. By the end of this guide, you will possess a deep understanding of how to leverage ua-parser to gain a significant competitive edge in search rankings.
Deep Technical Analysis: The Anatomy of User Agent Parsing
A user agent string is essentially a textual identifier sent by a client (typically a web browser or a bot) to a web server. It contains a wealth of information about the client's software and hardware, including its operating system, browser name and version, rendering engine, and sometimes even device type. For SEO, this information is invaluable for several reasons:
- Search Engine Bot Identification: Distinguishing between legitimate search engine crawlers (like Googlebot, Bingbot) and malicious bots is crucial for accurate analytics and security.
- Device and Platform Optimization: Understanding the prevalence of mobile devices, specific operating systems (iOS, Android), and browsers used by your target audience allows for tailored content and UX.
- Performance Tuning: Identifying older or less capable browsers can inform decisions about progressive enhancement or fallback strategies.
- Content Personalization: Serving content that is most relevant to a user's device or operating system can significantly improve engagement.
The Structure of a User Agent String
User agent strings are not standardized and can vary wildly. However, they generally follow a pattern:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/118.0.0.0 Safari/537.36
Let's break down this example:
- Mozilla/5.0: A historical artifact indicating compatibility with Netscape Navigator.
- (Windows NT 10.0; Win64; x64): Operating system details (Windows 10, 64-bit).
- AppleWebKit/537.36 (KHTML, like Gecko): Rendering engine information.
- Chrome/118.0.0.0: Browser name and version (Chrome 118).
- Safari/537.36: Often included to ensure compatibility with websites that check for Safari.
How ua-parser Works
ua-parser operates by leveraging a set of predefined regular expressions and patterns. When provided with a user agent string, it systematically attempts to match parts of the string against these patterns to extract structured data. This typically involves:
- Pattern Matching: Applying regular expressions to identify known browser families, operating systems, and device types.
- Data Extraction: Extracting specific versions, names, and other relevant attributes.
- Normalization: Presenting the extracted data in a consistent, structured format (e.g., JSON).
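As a rough sketch of that match-extract-normalize pipeline (the two patterns below are illustrative toys, not uap-core's real regexes, which number in the hundreds):

```python
import re

# Toy pattern table: (compiled regex, browser family). Real ua-parser
# implementations ship a much larger, community-maintained pattern set.
PATTERNS = [
    (re.compile(r"Chrome/(\d+)\.(\d+)"), "Chrome"),
    (re.compile(r"Firefox/(\d+)\.(\d+)"), "Firefox"),
]

def parse_browser(ua: str) -> dict:
    """Match the UA against known patterns and return normalized fields."""
    for pattern, family in PATTERNS:
        m = pattern.search(ua)
        if m:
            return {"family": family, "major": m.group(1), "minor": m.group(2)}
    return {"family": "Other", "major": None, "minor": None}
```

The real libraries follow the same shape: try each pattern in order, and fall back to an "Other" family when nothing matches.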
Key Libraries and Implementations
ua-parser is not a single monolithic tool but rather a concept with implementations across various programming languages. The most prominent and widely adopted are:
- ua-parser-js: The JavaScript implementation, ideal for client-side parsing (though often used server-side with Node.js).
- ua-parser (Python): A robust Python library for server-side user agent parsing.
- ua-parser (Java): A Java port for server-side applications.
- ua-parser (PHP): A PHP library for server-side parsing.
The underlying data used by these libraries is often maintained collaboratively, ensuring that they are updated to recognize new browsers, devices, and bots as they emerge.
Finding Documentation and Examples for ua-parser in SEO
The key to effectively using ua-parser for SEO lies in accessing reliable documentation and practical examples. Fortunately, the open-source nature of these libraries means that resources are readily available.
Official Project Repositories and Documentation
The primary source for any open-source library is its official repository. Here, you'll find the most up-to-date information, installation instructions, API references, and often, example usage.
ua-parser-js:
- GitHub Repository: github.com/ua-parser/ua-parser-js
- Documentation: The README file on the GitHub repository is usually extensive. Look for sections on installation, usage, and the output structure.
- npm Package: npmjs.com/package/ua-parser-js (often includes basic usage examples).
ua-parser (Python):
- GitHub Repository: github.com/ua-parser/uap-python (the ua-parser GitHub organization also hosts uap-core, the shared regex data that all language ports rely on).
- PyPI Package: pypi.org/project/ua-parser/ (provides installation and basic usage).
- Documentation: The uap-python README covers installation and usage; the uap-core repository documents the regexes.yaml file that drives parsing.
- Other Language Implementations: For Java, PHP, and other languages, search for "ua-parser [language name]" on their respective package repositories (e.g., Maven Central for Java, Packagist for PHP); the official ports live under the ua-parser GitHub organization (e.g., uap-java, uap-php).
SEO-Focused Examples and Tutorials
While official documentation is crucial, finding examples specifically tailored to SEO use cases requires looking beyond the core library. These can be found on tech blogs, SEO forums, and developer communities.
- Blog Posts: Search for terms like "ua-parser SEO examples," "user agent parsing for Googlebot detection," "optimizing for mobile with ua-parser," or "detecting headless CMS with user agents." Many SEO experts and developers share their findings and code snippets.
- Stack Overflow: This is an invaluable resource for specific coding challenges. Search for questions combining ua-parser and SEO keywords. You'll often find practical code snippets and solutions to common problems.
- Developer Forums and Communities: Platforms like Dev.to, Medium, and various language-specific forums often feature articles and discussions on using libraries like ua-parser for practical applications, including SEO.
- Web Analytics Platforms: While not directly providing ua-parser code, platforms like Google Analytics, Matomo, and Adobe Analytics provide aggregated data on user agents, which can guide your understanding of what data is valuable and how to interpret it using ua-parser.
Key Information to Look for in Documentation/Examples:
- Installation Guides: How to add the library to your project.
- Basic Usage: Simple code examples to parse a user agent string.
- Output Structure: Understanding the JSON or object format of the parsed data (e.g., browser name, version, OS name, OS version, device brand, device model).
- Data Fields for SEO: Identifying which parsed fields are most relevant for SEO tasks.
- Handling Edge Cases: How the library deals with unknown or malformed user agent strings.
- Updating the Regexes: Information on how to update the parsing rules to recognize new agents.
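For orientation, the Python port's `user_agent_parser.Parse()` returns a nested dict along these lines (field names follow uap-core; the values shown are illustrative for a desktop Chrome user agent, not live output):

```python
# Illustrative output shape of ua_parser's Parse(); values correspond
# to a desktop Chrome UA. The library also echoes back the raw string.
parsed = {
    "user_agent": {"family": "Chrome", "major": "118", "minor": "0", "patch": "0"},
    "os": {"family": "Windows", "major": "10", "minor": None, "patch": None, "patch_minor": None},
    "device": {"family": "Other", "brand": None, "model": None},
}

# The fields most commonly used for SEO tasks:
browser = f"{parsed['user_agent']['family']} {parsed['user_agent']['major']}"
is_desktop = parsed["device"]["family"] == "Other"
```

Knowing this shape up front makes it much easier to read the examples in the documentation and to decide which fields matter for your reporting.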
5+ Practical Scenarios for ua-parser in SEO
The true power of ua-parser for SEO is realized through its application in various strategic scenarios. Here are some key examples:
Scenario 1: Accurate Search Engine Bot Identification and Analytics
Problem: Differentiating between legitimate search engine crawlers and other bots (scrapers, malicious bots) is crucial for understanding traffic sources and ensuring accurate SEO performance metrics. Misattributing traffic can lead to flawed decision-making.
Solution: Use ua-parser to identify bots. You can then log or segment traffic based on whether it's from a known search engine crawler.
Example (Conceptual - JavaScript/Node.js):
const UAParser = require('ua-parser-js');

const userAgent = req.headers['user-agent'] || ''; // Assuming Express.js
const parser = new UAParser(userAgent);
const result = parser.getResult();

// Note: the core ua-parser-js library does not expose a 'bot' device type,
// so check the raw user agent string for known crawler tokens instead.
if (/Googlebot/i.test(userAgent)) {
  console.log('Traffic from Googlebot.');
  // Log as search engine crawl
} else if (/bingbot/i.test(userAgent)) {
  console.log('Traffic from Bingbot.');
  // Log as search engine crawl
} else if (/bot|crawler|spider/i.test(userAgent)) {
  console.log(`Traffic from other bot: ${userAgent}`);
  // Log as other bot traffic
} else {
  console.log(`Traffic from user: ${result.browser.name} on ${result.os.name}`);
  // Log as regular user traffic
}
SEO Benefit: Clean analytics, accurate crawl budget analysis, and the ability to serve specific content or adjust crawl rates for bots.
Scenario 2: Optimizing for Mobile-First Indexing
Problem: Google's mobile-first indexing means that the mobile version of your content is used for indexing and ranking. Ensuring your mobile experience is superior is critical.
Solution: Use ua-parser to identify mobile devices and deliver a tailored experience or at least log mobile traffic for analysis.
Example (Conceptual - Python):
from ua_parser import user_agent_parser

user_agent_string = "Mozilla/5.0 (iPhone; CPU iPhone OS 13_5 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1.1 Mobile/15E148 Safari/604.1"
parsed_ua = user_agent_parser.Parse(user_agent_string)

if parsed_ua['device']['family'] != 'Other' and parsed_ua['os']['family'] in ['iOS', 'Android']:
    print("This is a mobile device.")
    # Potentially serve a mobile-optimized version of the page
    # or log for mobile traffic analysis
else:
    print("This is not a mobile device.")
SEO Benefit: Improved mobile user experience, better rankings in mobile-first indexing, and reduced bounce rates from mobile users.
Scenario 3: Content Personalization Based on User Environment
Problem: Users on different operating systems or devices might prefer or benefit from different content formats or presentations.
Solution: Serve dynamic content based on the parsed user agent. For example, a technical tutorial might link to a specific IDE plugin for Windows users versus macOS users.
Example (Conceptual - PHP):
<?php
require_once 'vendor/autoload.php'; // If using Composer
use UAParser\Parser;

$uaString = $_SERVER['HTTP_USER_AGENT'] ?? '';
$parser = Parser::create();
$result = $parser->parse($uaString);
// uap-php exposes parsed data as properties; os family is e.g. "Windows" or "Mac OS X"
$osName = $result->os->family;

if ($osName === 'Windows') {
    echo "<p>Download our Windows-specific tool here: <a href='/download/windows'>Download</a></p>";
} elseif ($osName === 'Mac OS X') {
    echo "<p>Download our macOS tool here: <a href='/download/macos'>Download</a></p>";
} else {
    echo "<p>Check our general download page: <a href='/download/general'>Download</a></p>";
}
?>
SEO Benefit: Increased user engagement, longer time on site, and potentially improved conversion rates, which are indirect SEO ranking factors.
Scenario 4: Identifying and Managing Crawl Budget
Problem: Search engines allocate a "crawl budget" to each website, determining how many pages they will crawl and how often. Inefficient crawling by unimportant bots can waste this budget.
Solution: Use ua-parser to identify non-essential bots and potentially block or limit their access to reduce wasted crawl budget. This allows search engines to focus on more important pages.
Example (Conceptual - Server-side Logic):
On your web server (e.g., Nginx, Apache, or within your application logic), you can inspect the user agent. If it's identified by ua-parser as a known scraper or a bot you don't want to give high priority, you can serve a 403 Forbidden or a 410 Gone status code, or even a lighter version of the page.
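A minimal sketch of such a gate, assuming a hypothetical list of low-priority crawlers (the bot tokens below are examples only; build your own list from log analysis, and never block verified major search engine crawlers this way):

```python
# Decide an HTTP status for a request based on its user agent string.
# The token list is illustrative, not a vetted blocklist.
LOW_PRIORITY_BOT_TOKENS = ("mj12bot", "ahrefsbot", "semrushbot")

def crawl_policy(ua_string: str) -> int:
    """Return 403 for low-priority bots, 200 for everything else."""
    ua_lower = ua_string.lower()
    if any(token in ua_lower for token in LOW_PRIORITY_BOT_TOKENS):
        return 403
    return 200
```

In practice you would plug this decision into your web framework's middleware or your reverse proxy's access rules.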
SEO Benefit: Ensures that valuable pages are crawled and indexed efficiently by major search engines, preventing important content from being missed.
Scenario 5: Supporting Legacy Browsers and Progressive Enhancement
Problem: While less common now, some users might still access websites with older browsers that lack modern web standards. Ensuring a baseline experience is important for accessibility and broader reach.
Solution: Parse the browser version using ua-parser. If a user is on a very old browser, you can serve a simpler, more compatible version of the page or provide clear indicators about browser update recommendations.
Example (Conceptual - JavaScript):
// Client-side: load ua-parser-js via a <script> tag or a bundler.
const parser = new UAParser(); // defaults to navigator.userAgent in the browser
const browserVersion = parseInt(parser.getBrowser().version, 10);

if (!Number.isNaN(browserVersion) && browserVersion < 60) { // Example threshold
  console.log('You are using an older browser. Some features may not be available.');
  // Load alternative stylesheets or disable JavaScript features
  document.body.classList.add('legacy-browser');
}
SEO Benefit: Wider accessibility, ensuring that users with less common or older browsers can still access your content, contributing to overall reach.
Scenario 6: Detecting Headless CMS and Static Site Generators
Problem: Websites built with modern architectures like Headless CMS or Static Site Generators (SSGs) often have distinct user agent signatures, especially for their preview or build environments.
Solution: By analyzing these specific user agent strings, you can identify the underlying technology stack. This can be useful for targeted SEO strategies, like optimizing for the specific rendering mechanisms of a framework or understanding how preview environments interact with your site.
Example (Conceptual - Node.js):
const UAParser = require('ua-parser-js');

const userAgent = req.headers['user-agent'] || '';
const result = new UAParser(userAgent).getResult();

// Example: detecting a (hypothetical) headless CMS preview bot by its token.
// Core ua-parser-js does not classify bots, so match the raw string;
// `result` still provides browser/OS details for ordinary visitors.
if (userAgent.includes('MyAppPreviewBot')) {
  console.log('This request is likely from a Headless CMS preview.');
  // Apply specific SEO optimizations for preview environments, e.g., noindexing
}
SEO Benefit: Enables fine-tuned optimization for modern web architectures, ensuring that search engines correctly index content generated by these systems and that preview environments don't interfere with live SEO.
Global Industry Standards and Best Practices in User Agent Parsing for SEO
While there isn't a single, universally mandated standard for user agent strings themselves (they are often proprietary), the way we *use* the information derived from them for SEO is guided by several industry best practices and evolving standards.
W3C Recommendations and Guidelines
The World Wide Web Consortium (W3C) sets standards for the web. While they don't dictate user agent string formats, their guidelines on accessibility, mobile web design, and SEO indirectly inform how user agent parsing is beneficial.
- Mobile Web Best Practices: Emphasize the importance of delivering a good experience across devices, making user agent detection for mobile optimization a de facto standard.
- SEO Best Practices: Google and other search engines provide guidelines that indirectly support user agent parsing, such as the importance of clear content for all users (including bots) and mobile-friendliness.
Google's Guidelines for Bots
Google provides specific documentation on how webmasters should handle Googlebot. This includes how to identify it and what to expect from its crawling behavior.
- Identifying Googlebot: Google recommends verifying Googlebot by performing a reverse DNS lookup on its IP address and checking that the hostname resolves back to an IP address that Google owns. Using ua-parser to identify the string "Googlebot" is a primary step, but DNS verification is key for critical security decisions.
- Crawl Budget Management: Google's advice on crawl budget management implicitly encourages understanding and controlling bot traffic, which ua-parser facilitates.
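The verification Google describes can be sketched as a reverse DNS lookup followed by a forward-confirming lookup. This is a minimal illustration; in production you would cache results and may also check Google's published crawler IP ranges:

```python
import socket

GOOGLE_HOST_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_hostname(hostname: str) -> bool:
    """Check that a reverse-DNS hostname belongs to Google's crawler domains."""
    return hostname.endswith(GOOGLE_HOST_SUFFIXES)

def verify_googlebot(ip_address: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-confirm the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)  # reverse DNS
    except OSError:
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except OSError:
        return False
    return ip_address in forward_ips  # forward lookup must confirm the original IP
```

The forward-confirmation step matters because any server can set an arbitrary reverse DNS name; only the round trip proves the IP really belongs to Google.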
IAB Standards
The Interactive Advertising Bureau (IAB) has standards related to ad delivery and measurement, which sometimes involve user agent parsing to understand the viewing environment (e.g., desktop, mobile app, connected TV). While not directly SEO, these standards highlight the importance of structured user agent data in digital media.
Data Privacy and Ethical Considerations
As user agent strings can sometimes contain information that, when combined with other data, could potentially identify users, ethical considerations are paramount.
- Anonymization: Ensure that any data collected through user agent parsing is anonymized and not used to track individual users without consent.
- Purpose Limitation: Use the parsed data strictly for the intended SEO purposes (analytics, optimization) and not for invasive profiling.
- GDPR and CCPA Compliance: Be mindful of data privacy regulations, especially when collecting and processing any user-related information.
The Role of Regex Updates
A critical best practice for maintaining the effectiveness of ua-parser is keeping its underlying regular expressions and data patterns up-to-date. This is an ongoing effort, as new browsers, devices, and bots are released constantly.
- Community Contributions: The ua-parser project thrives on community contributions. Regularly checking the project's GitHub repository for updates and new regex patterns is advisable.
- Automated Updates: Some implementations offer mechanisms for automatically fetching updated regex data.
Multi-Language Code Vault: ua-parser Examples
To provide a comprehensive resource, here are practical code snippets for using ua-parser in some of the most common programming languages relevant to web development and SEO.
1. JavaScript (Node.js / Browser)
Installation:
npm install ua-parser-js
Example:
const UAParser = require('ua-parser-js');

function parseUserAgent(userAgentString) {
  const parser = new UAParser(userAgentString);
  const result = parser.getResult();

  console.log('--- Parsed User Agent ---');
  console.log('Browser:', result.browser.name, result.browser.version);
  console.log('OS:', result.os.name, result.os.version);
  console.log('Device:', result.device.vendor, result.device.model, result.device.type);
  console.log('CPU:', result.cpu.architecture);
  console.log('-------------------------');

  // Example SEO Application: check for Googlebot.
  // Core ua-parser-js does not classify bots, so test the raw string.
  if (/Googlebot/i.test(userAgentString)) {
    console.log('This is Googlebot. Relevant for SEO crawling.');
  }

  // Example SEO Application: check if it's a mobile device
  if (result.device.type === 'mobile') {
    console.log('This is a mobile device. Important for mobile-first indexing.');
  }

  return result;
}
// --- Test Cases ---
const uaChromeDesktop = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/118.0.0.0 Safari/537.36";
const uaGooglebot = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
const uaIphone = "Mozilla/5.0 (iPhone; CPU iPhone OS 16_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Mobile/15E148 Safari/604.1";
const uaUnknownBot = "MyCoolScraper/1.0";
parseUserAgent(uaChromeDesktop);
parseUserAgent(uaGooglebot);
parseUserAgent(uaIphone);
parseUserAgent(uaUnknownBot);
2. Python
Installation:
pip install ua-parser
Example:
from ua_parser import user_agent_parser

def parse_user_agent_python(user_agent_string):
    parsed_ua = user_agent_parser.Parse(user_agent_string)
    print('--- Parsed User Agent ---')
    print(f"Browser: {parsed_ua['user_agent']['family']} {parsed_ua['user_agent']['major']}.{parsed_ua['user_agent']['minor']}")
    print(f"OS: {parsed_ua['os']['family']} {parsed_ua['os']['major']}.{parsed_ua['os']['minor']}")
    print(f"Device: {parsed_ua['device']['family']}")
    print('-------------------------')

    # Example SEO Application: check for Googlebot
    # (uap-core also labels most crawlers with the device family 'Spider')
    if parsed_ua['user_agent']['family'] == 'Googlebot':
        print("This is Googlebot. Relevant for SEO crawling.")

    # Example SEO Application: check if it's a mobile device
    # The OS family is a more reliable signal than device family names
    if parsed_ua['os']['family'] in ['iOS', 'Android']:
        print("This is a mobile device. Important for mobile-first indexing.")

    return parsed_ua
# --- Test Cases ---
ua_chrome_desktop_py = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/118.0.0.0 Safari/537.36"
ua_googlebot_py = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
ua_iphone_py = "Mozilla/5.0 (iPhone; CPU iPhone OS 16_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Mobile/15E148 Safari/604.1"
ua_unknown_bot_py = "MyCoolScraper/1.0"
parse_user_agent_python(ua_chrome_desktop_py)
parse_user_agent_python(ua_googlebot_py)
parse_user_agent_python(ua_iphone_py)
parse_user_agent_python(ua_unknown_bot_py)
3. PHP
Installation (using Composer):
composer require ua-parser/ua-parser
Example:
<?php
require_once 'vendor/autoload.php'; // If using Composer
use UAParser\Parser;

function parse_user_agent_php($userAgentString) {
    $parser = Parser::create();
    $result = $parser->parse($userAgentString);

    echo "--- Parsed User Agent ---\n";
    echo "Browser: " . $result->ua->family . " " . $result->ua->toVersion() . "\n";
    echo "OS: " . $result->os->family . " " . $result->os->toVersion() . "\n";
    echo "Device: " . $result->device->family . "\n";
    echo "-------------------------\n";

    // Example SEO Application: check for Googlebot
    if ($result->ua->family === 'Googlebot') {
        echo "This is Googlebot. Relevant for SEO crawling.\n";
    }

    // Example SEO Application: check if it's a mobile device
    // (uap-core reports device families rather than a 'smartphone' type,
    // so the OS family is the more reliable signal)
    if (in_array($result->os->family, ['iOS', 'Android'], true)) {
        echo "This is a mobile device. Important for mobile-first indexing.\n";
    }

    return $result;
}
// --- Test Cases ---
$uaChromeDesktopPHP = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/118.0.0.0 Safari/537.36";
$uaGooglebotPHP = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
$uaIphonePHP = "Mozilla/5.0 (iPhone; CPU iPhone OS 16_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Mobile/15E148 Safari/604.1";
$uaUnknownBotPHP = "MyCoolScraper/1.0";
parse_user_agent_php($uaChromeDesktopPHP);
parse_user_agent_php($uaGooglebotPHP);
parse_user_agent_php($uaIphonePHP);
parse_user_agent_php($uaUnknownBotPHP);
4. Java
Installation (Maven dependency for uap-java, the official Java port):
<dependency>
    <groupId>com.github.ua-parser</groupId>
    <artifactId>uap-java</artifactId>
    <version>1.6.1</version> <!-- Check for the latest version -->
</dependency>
Example:
import ua_parser.Parser;
import ua_parser.Client;

public class UAParserExample {

    // Reuse one Parser instance; construction loads the regex data.
    private static final Parser UA_PARSER = new Parser();

    public static void main(String[] args) {
        String uaChromeDesktop = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/118.0.0.0 Safari/537.36";
        String uaGooglebot = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
        String uaIphone = "Mozilla/5.0 (iPhone; CPU iPhone OS 16_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Mobile/15E148 Safari/604.1";
        String uaUnknownBot = "MyCoolScraper/1.0";
        parseUserAgentJava(uaChromeDesktop);
        parseUserAgentJava(uaGooglebot);
        parseUserAgentJava(uaIphone);
        parseUserAgentJava(uaUnknownBot);
    }

    public static void parseUserAgentJava(String userAgentString) {
        Client client = UA_PARSER.parse(userAgentString);
        System.out.println("--- Parsed User Agent ---");
        System.out.println("Browser: " + client.userAgent.family + " " + client.userAgent.major);
        System.out.println("OS: " + client.os.family + " " + client.os.major);
        System.out.println("Device: " + client.device.family);
        System.out.println("-------------------------");

        // Example SEO Application: check for Googlebot
        // (uap-core parses Googlebot as its own user agent family)
        if ("Googlebot".equals(client.userAgent.family)) {
            System.out.println("This is Googlebot. Relevant for SEO crawling.");
        }

        // Example SEO Application: check if it's a mobile device via the OS family
        if ("iOS".equals(client.os.family) || "Android".equals(client.os.family)) {
            System.out.println("This is a mobile device. Important for mobile-first indexing.");
        }
    }
}
Future Outlook: AI, Machine Learning, and Advanced User Agent Analysis
The field of user agent analysis, while rooted in pattern matching, is poised for further evolution, driven by advancements in artificial intelligence and machine learning.
AI-Powered User Agent Detection
While ua-parser relies on predefined rules, future systems may employ machine learning models to identify user agents. These models could:
- Learn from Data: Continuously learn from vast datasets of user agent strings and their corresponding behaviors or classifications.
- Detect Anomalies: Identify novel or sophisticated bots that deviate from known patterns, improving bot detection and security.
- Predict User Intent: Potentially infer more about user intent or context based on subtle variations in user agent strings, beyond simple classification.
Integration with Advanced SEO Tools
ua-parser and similar technologies will likely become more deeply integrated into advanced SEO platforms and analytics suites. This will enable:
- Real-time SEO Adjustments: Automated adjustments to website content or caching strategies based on detected user agents and their SEO impact.
- Predictive SEO Analytics: Using parsed user agent data in conjunction with other signals to predict future search trends or user behavior.
- Enhanced Crawl Budget Optimization: More sophisticated algorithms that use AI to prioritize crawling and indexing based on a deeper understanding of bot behavior.
The Evolving User Agent Landscape
As new technologies emerge (e.g., more sophisticated AI agents, immersive web experiences, interconnected devices), user agent strings will continue to evolve. This will necessitate ongoing development and adaptation of parsing tools like ua-parser.
- Augmented Reality (AR) and Virtual Reality (VR): User agents for AR/VR browsers or applications will present new challenges and opportunities for content delivery.
- IoT Devices: As more Internet of Things (IoT) devices interact with web services, their user agent strings will need to be parsed and understood.
In conclusion, ua-parser is an indispensable tool for modern SEO. By understanding where to find its documentation, practical examples, and by applying its capabilities across various strategic scenarios, SEO professionals can significantly enhance their website's visibility, user experience, and overall search engine performance. The ongoing evolution of user agent analysis, fueled by AI, promises even more sophisticated applications in the future, making a solid grasp of this technology a critical asset for any serious SEO practitioner.
© 2023 [Your Name/Publication Name]. All rights reserved.