Content Filtering: Sifting Through the Mess

NASA SEWP Security Center
Aaron Powell
Christopher Vincent
July 20, 2006

DISCLAIMER: This document is intended for informational purposes only and is no substitute for performing one's own analysis of the products and solutions discussed herein. It represents the NASA SEWP Security Center's analysis and opinions. There are no express or implied warranties regarding the veracity of the information provided. When implementing any content filtering solution, it would be wise to seek legal counsel beforehand.

Introduction

As business, government, and non-profit organizations continue to combat information insecurity, they have made some headway against common problems that once plagued network administrators. Regular anti-virus scans, locked-down firewall rule-sets, persistent log monitoring, and strong password policies have reduced some of the more egregious offenses against information technology infrastructures. As a result, browser-based vulnerabilities have received increased scrutiny as a possible avenue for improving security. In response to these threats and others, vendors are marketing numerous suites of products claiming to protect end-hosts from attacks via the web and web-related services. Their products, both hardware and software, are usually referred to as content-filtering products. Web filters have existed for some time, primarily as a means to prevent children from viewing material deemed inappropriate and to prevent users on public machines from accessing illegal material, but this new generation of products hopes to address newly identified problems as well.

Currently, organizations face several different problems that they hope to address via "content filtering." First, organizations hope to prevent authorized users from transporting and storing illegal materials via their networks and machines, possibly creating liabilities on the part of the company. Second, organizations hope to prevent authorized users from wasting company resources by using the internet for legal, non-work-related purposes. Finally, organizations hope to protect their equipment from vulnerabilities introduced by spyware, adware, viruses, and other malware, including malicious web content that exploits vulnerabilities in the web browser itself. These problems are distinct but related, because the same sites that host illegal or productivity-reducing content often include spyware in their products and use browser-based vulnerabilities to attack the hosts that visit them.

The network administrator has several options for implementing a content-filtering solution. A) The filtering can be done inline, by dropping unauthorized packets, or by eavesdropping and killing unauthorized connections. B) The filtering can be performed via a dedicated hardware device or via software installed on a general-purpose machine. C) The filtering policy can be applied uniquely to individual workstations or to the organization as a whole. D) The filter can look only at the means of communication (the URL and the port), or it can look at the contents of the communication by parsing and processing the payload of the packets in transmission.
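To make these design choices concrete, the following is a minimal, purely illustrative sketch of how a filtering policy might encode them. The field names and values are our own invention and do not correspond to any vendor's actual configuration interface.

```python
from dataclasses import dataclass, field

# Illustrative only: the four dimensions mirror options A-D above.
@dataclass
class FilterPolicy:
    placement: str = "inline"          # A) "inline" (drop packets) or "eavesdrop" (reset connections)
    platform: str = "software"         # B) dedicated "appliance" or "software" on general-purpose hardware
    scope: str = "organization"        # C) "organization" or "per-host"
    inspection: str = "url_and_port"   # D) "url_and_port" only, or full "payload" inspection
    blocked_categories: set = field(default_factory=lambda: {"gambling", "known-malware"})

# Example: an eavesdropping, payload-inspecting deployment for the whole organization.
policy = FilterPolicy(placement="eavesdrop", inspection="payload")
print(policy)
```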
In all cases, filtering relies on large databases of current information about possible threats, either in the form of a list of unauthorized URLs, a list of unauthorized ports and IP ranges, or a set of signatures of known or potentially unauthorized content. The databases require constant updating to remain current and effective, and some proportion of the system's resources will be devoted to contacting a centralized server to obtain new URL assessments and new signatures.

Technology behind Content Filtering

Inline Filtering vs. Eavesdropping

Filtering products achieve the desired effect of interrupting unwanted flows of information in multiple ways. If the filtering product sits on the gateway route (or on the actual gateway), the product can filter by dropping any packets that have an undesirable characteristic. For example, if a content filter receives a packet originating at www.apple.com and the filter is configured to block www.apple.com, the filtering application can erase the packet instead of forwarding it to the host. This is how a typical firewall operates. In addition, firewall rule-sets typically perform some basic sanity checks to prevent local packets from misidentifying their source IP address (sometimes referred to as ingress/egress filtering). Packets with inaccurate information, or packets coming from blocked IP ranges or on blocked ports, are simply dropped.

A filtering product can also interrupt the flow of data if the filter has access to all the traffic on a network (i.e., eavesdropping) and can insert its own traffic onto the network. For example, if a content filter is attached to a hub (rather than a switch), the content filter can "read" all the traffic traveling to and from the various hosts connected to the hub. Whenever it sees an undesirable TCP connection involving a local host and a remote machine, the filter can fire off an HTTP redirect to the local host (assuming the hosts are communicating via HTTP), and then send TCP RST or TCP FIN messages to both the local and remote machines, closing the connection.

There are advantages and disadvantages to both approaches. One advantage of inline filtering is that the inline product can prevent any and all communication to a "blacklisted" remote machine, as identified by its IP address. The inline product will simply drop the attempted connection, log it, and possibly serve a message to the offending local host indicating that the connection was not established. On the other hand, if the filtering process is slow, the filtering gateway might introduce transmission delays when it is forced to analyze large amounts of content on the fly. Inline filters are prone to becoming detrimental bottlenecks, adversely affecting both good and bad data in transmission. Additionally, should an inline filter fail closed, all access outside the gateway would halt; this may or may not be desirable and should be considered in light of an organization's availability, security, and legal requirements.

Eavesdropping filters do not suffer from the single-point-of-failure and bottleneck problems that plague filtering gateways, but they come with their own limitations. First, eavesdropping filters require a network configuration in which all traffic can be viewed simultaneously from one point. A simple hub, under the original conception of Ethernet traffic, automatically provides the necessary functionality for complete eavesdropping.
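To make the eavesdropping approach concrete, here is a minimal sketch of the connection-reset technique described above. It assumes the scapy library, a vantage point (hub or SPAN port) where the traffic of interest is visible, root privileges for raw packet access, and a hypothetical blocked-address set; a production filter would consult its URL/IP database and track sequence numbers far more carefully.

```python
from scapy.all import sniff, send, IP, TCP

# Hypothetical blacklist; a real filter would consult its categorization database.
BLOCKED_HOSTS = {"203.0.113.10"}

def reset_if_blocked(pkt):
    """Inject a TCP RST toward the local host when it talks to a blocked address."""
    if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt[IP].dst in BLOCKED_HOSTS:
        # Forge a reset that appears to come from the blocked remote host.
        rst = (IP(src=pkt[IP].dst, dst=pkt[IP].src) /
               TCP(sport=pkt[TCP].dport, dport=pkt[TCP].sport,
                   flags="R", seq=pkt[TCP].ack))
        send(rst, verbose=False)

# Watch all TCP traffic visible from this vantage point and react to it.
sniff(filter="tcp", prn=reset_if_blocked, store=False)
```

The reset only helps if the filter actually sees the traffic and if the endpoints honor it, which is exactly where the limitations discussed next come in.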
Unfortunately, most contemporary Ethernet networks are configured such that individual end hosts cannot eavesdrop on traffic that traverses a switched network. In some networks, it might pose a significant engineering challenge to configure the network so that one connection can eavesdrop on the entirety of the network traffic. Further, the eavesdropping filter's ability to terminate an unwanted connection relies on the correct behavior of the local and remote machines. If the local and remote machines ignore TCP resets and HTTP redirects, the malicious traffic can still enter the network and potentially the host. Even if the local and remote machines respond appropriately, an overloaded eavesdropping filter may react too slowly to prevent the transmission of a malicious payload. Finally, some protocols, such as UDP, cannot be filtered by an eavesdropping device. An eavesdropping filter has no way to halt the flow of connectionless traffic like UDP because there is no connection to reset. Thus, the only way to filter UDP-type protocols is via an inline device.

Hardware vs. Software

Typically, people believe that hardware solutions are faster than their software counterparts. Often, hardware vendors rely on this belief to market their products and to make claims about the limitations of software products. However, it is not necessarily the case that a hardware solution is faster than a comparable software solution. First, any software-based content-filtering solution must be installed on some hardware device, and that device could be faster, slower, or equivalent to any dedicated hardware solution. Also, software solutions are not tied to proprietary hardware, which can result in lower up-front and long-term maintenance costs. Additionally, some software solutions allow spanning across hardware, giving two advantages: better continuity of operations and capacity scalability. Should a hardware device fail, or should additional capacity become necessary, an inexpensive COTS PC can be purchased and put in place.

Certainly, if a hardware device is specially manufactured to optimize the processing of high volumes of network traffic, it is likely to produce better results than a software solution installed on a generic machine. Unfortunately, hardware vendors may falsely imply that their hardware products are customized solutions designed specifically to handle the task of content filtering when they are, in reality, generic rack-mount PCs loaded with proprietary software that cannot be purchased separately. This misleads the purchaser into thinking that the hardware device differs from a standard PC when it does not.

URL Categorization and Port Blocking

Many so-called "content-filtering" products are not actually content filters in any strict sense of the phrase. They do not look at the contents of the payloads being transmitted. Rather, they perform URL categorization-based filtering to prevent machines from accessing specific web resources and port blocking to prevent machines from using certain services or protocols. URL categorization-based filtering relies on a database of URLs that associates every conceivable location on the internet with one or more subject-matter categories (as well as categories for sites that are known to serve malware). The filter checks the source or destination of the traffic to see whether that source or destination has been blacklisted or whitelisted and allows or blocks the traffic accordingly.
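A minimal sketch of such a lookup follows; the category database, policy, and example hosts are invented for illustration and bear no relation to any vendor's actual taxonomy.

```python
from urllib.parse import urlparse

# Tiny stand-in for a vendor's multi-million-entry URL categorization database.
URL_CATEGORIES = {
    "www.ebay.com": {"online auctions", "shopping"},
    "malware.example": {"known malware sources"},
}

BLOCKED_CATEGORIES = {"online auctions", "known malware sources"}

def allowed(url: str) -> bool:
    """Allow traffic unless the host falls into a blocked category."""
    host = urlparse(url).hostname or ""
    categories = URL_CATEGORIES.get(host, set())  # unknown hosts are uncategorized
    return not (categories & BLOCKED_CATEGORIES)

print(allowed("http://www.ebay.com/item/123"))    # False: categorized and blocked
print(allowed("http://brand-new-site.example/"))  # True: not yet in the database
```

The second lookup illustrates the coverage problem discussed below: a site that has not yet been categorized is simply passed through.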
For example, www.ebay.com might be categorized under online auction sites, which might be a subcategory of shopping sites. If an administrator blocked online auction sites or shopping sites, traffic to and from any site in the category, including www.ebay.com, would be filtered. As with URL categorization, port blocking relies on a small database of known protocols, defined by the TCP/UDP ports over which they operate and, if applicable, the IP addresses they require to communicate. The filter merely blocks traffic to the ports over which the banned services normally operate.

There are several limitations to URL categorization and port blocking as a means to control content. First, a given URL must be in the database and correctly categorized for content filtering to function appropriately. If your network's users are the first to visit a site with problematic content, your network becomes the guinea pig for the rest of the vendor's clients. Further, naive port blocking only looks at the standard ports over which a protocol communicates. Traffic can easily circumvent a port-blocking filter by running a protocol over an open, non-standard port. On the other hand, port blocking on an eavesdropping filter provides a network-protecting mechanism that is unavailable on an inline filter, because it can terminate unauthorized connections between machines on the same network. As a result, eavesdropping filters provide some benefit that exceeds the typical functionality of the gateway firewall. The same effect can be achieved, though, if all machines internal to the network are themselves running locked-down local firewalls.

One oft-touted property of URL-categorizing filters is their ability to block spyware, adware, and other malware. It is important for a network administrator to understand how this filtering is accomplished by a pure URL-categorizing filter. The filter itself is completely unaware of the contents of any payloads traversing its network (except for the limited file-type identification capabilities included in some filters). The filter only looks at the URL source and destination of monitored traffic and compares it with a list of sites that are known to serve malware. If malware comes from a site that is unknown or labeled innocuous, the filter will not detect it, even if it is well-known malware.

Signature-based Content Filtering

One way to reduce the propagation of malware (including spyware and adware) is through the use of signature-based scanning techniques. In signature-based scanning, the scanning device relies on a vigilantly updated database of "signatures" of known malware. The database stores snippets of binary code sampled from known malware, and the filter searches the payloads of all the packets it encounters to see if they match known malware patterns from the database. If a packet matches a known malware pattern, it is dropped. In some cases the database contains the hash-function output of a known malware snippet rather than the malware code itself, because hash outputs are smaller and cannot be reversed to recover the original malware snippet. This reduces the size of the database and speeds the pattern-matching process. Unfortunately, when signature-based scanning is performed at the gateway, it reduces the gateway's throughput by increasing the latency of all the data scanned there. (Any average computer user is aware of the time-intensiveness of traditional anti-virus scanning.)
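A minimal sketch of the hash-based variant is shown below. The signature set is invented, the window size is arbitrary, and a real scanner would reassemble streams and use variable-length, stream-aware matching rather than hashing fixed-size windows.

```python
import hashlib

CHUNK = 64  # bytes per hashed window; purely illustrative

# Invented signature database: SHA-256 digests of known-bad byte snippets.
SIGNATURES = {
    hashlib.sha256(b"example-only stand-in for a malware byte pattern".ljust(CHUNK, b"\x00")).hexdigest(),
}

def payload_is_malicious(payload: bytes) -> bool:
    """Return True if any fixed-size window of the payload hashes to a known signature."""
    for offset in range(max(len(payload) - CHUNK, 0) + 1):
        if hashlib.sha256(payload[offset:offset + CHUNK]).hexdigest() in SIGNATURES:
            return True
    return False

# A gateway would run a check like this on every payload before forwarding it;
# that per-byte work is exactly where the latency discussed above comes from.
```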
Signature-based scanning has the capacity to discover known malware, but it may be unable to block novel code if that code is substantially different from previously encountered malware. Whether a system administrator chooses to implement intensive signature-based scanning on her network will depend on the service needs of the network's users. In some cases, such as Aladdin's eSafe suite, network administrators can select which traffic to scan and invoke quality-of-service policies to ensure that timely traffic delivery preempts malware scanning for critical content.

File Type Identification

Some measure of security can be achieved by limiting the transmission of data into the network based on the type of file being transmitted. For example, many organizations do not need to allow video files, audio files, or executables to traverse their gateway during normal operations. (Obviously, exceptions exist.) A content filter can prevent such files from entering the network, thus preventing the unintentional installation of malware or the appropriation of illegal content. File-type identification is not an easy task. Naive file-type identification systems look only at the filename extension attached to most filenames. For example, ".doc" usually indicates that a file is an MS Word document. Most computer systems have no means to prevent filenames from being changed arbitrarily (beyond a few limitations that apply to all filenames), so the mere presence of a ".doc" extension does not guarantee that the file is actually an MS Word document. Merely relying on filename extensions is easily circumvented. Better file-type identification can be achieved by analyzing the contents of the file itself. All Microsoft ".exe" files (executable programs) begin with the ASCII string "MZ". Similarly, all compiled Java class files begin with the hexadecimal value CAFEBABE. Thus, file-type identification that looks beyond filename extensions alone can prevent simple circumvention by name change. The task of actually categorizing the characteristics of every conceivable file type is daunting, though, because identifying information within a file can be non-unique, can change between different versions of a program, and is not standardized.

Deployment Considerations

Effectiveness

The major motivations for installing content filters are 1) to prevent illegal material from entering the network; 2) to reduce access to legal, non-work-related materials; and 3) to prevent malware (including browser-based exploits) from adversely affecting machines on the network. The ability of these technologies to solve the problems outlined above varies. Content filtering via URL categorization (and, to some extent, port blocking) stands a good chance of preventing known illegal material from entering the network, especially if the source of the illegal material is sufficiently well publicized and has already been added to the URL database. Unfortunately, hosts serving illicit materials move frequently to avoid legal recourse and network blockage. Signature-based content filtering also has the capacity to block illegal material, but it would require a company to amass huge amounts of illegal material and create signatures of it, an unlikely circumstance. Further, other kinds of filtering, including the use of keywords to search ASCII-encoded text, might help but can produce large numbers of false positives while failing to block some illegal content.
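As an illustration of why naive keyword filtering both over-blocks and under-blocks, consider the minimal sketch below; the keyword list and sample text are invented.

```python
# Invented keyword list; real products combine keywords with weights and context rules.
BANNED_KEYWORDS = {"poker", "warez"}

def keyword_blocked(text: str) -> bool:
    """Block any payload whose lowercased text contains a banned keyword."""
    lowered = text.lower()
    return any(word in lowered for word in BANNED_KEYWORDS)

# False positive: an innocuous announcement tripped by a substring match.
print(keyword_blocked("Local charity poker night raises funds for the library"))  # True

# False negative: the same kind of content slips through with trivial obfuscation.
print(keyword_blocked("p0ker tournament tonight"))  # False
```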
Another motivation for introducing content-filtering products is to increase productivity by preventing users from accessing legal, non-work-related web content. Some statistics suggest that more than 30% of internet usage in the workplace is not work-related. Certainly, blocking web sites prone to egregious offense (such as online gambling, online video games, etc.) has the capacity to prevent workplace problems and increase workplace productivity. Fundamentally, though, worker productivity is a management and personnel issue rather than a technology issue. Managers should be aware of what their subordinates are doing, what tasks their subordinates are responsible for, and how much time those tasks should take, so that they know whether their subordinates are actually working. Blocking internet traffic to expansive arrays of web sites, especially in an enterprise environment, is more likely to produce a large volume of help-desk calls and to complicate the content filter's rule set unnecessarily as increasing numbers of exceptions need to be carved out. Even if a content filter blocks all unnecessary internet content, the unproductive worker can always play solitaire or read the newspaper he or she brought to work.

Malware, like illegal content, can be significantly reduced through the use of a content filter. As mentioned, URL categorization requires a database of known malware sites, so networks remain vulnerable to any sites that are not yet in the database. (Network administrators are also at the mercy of the vendor's update schedule. This is not a catastrophic issue, but it does reduce the content filter's capacity to address new vulnerabilities.) Further, port blocking can be tricked by crafty malware that communicates via non-standard ports.

Browser-based vulnerabilities present some unique challenges not found in other contexts. Whereas many attacks exploit unprotected machines running flawed (or unpatched) services, browser-based vulnerabilities exploit the internet-browsing software itself. For most networks, the port over which typical internet services run is always open, so a firewall is a poor defense against browser-based vulnerabilities. URL categorization can prevent known attacks, and inline signature-based content filters can provide some protection against known exploits coming from novel sources. However, this protection against new sources comes at a significant performance cost, because the filter needs to scan content before transmitting it to its final destination, potentially introducing latency into the network. In addition to signature-based filtering, one possible aid to the problem of browser-based vulnerabilities is better browsers.

Legal Issues

Emphasis on URL categorization could create some legal issues if a product or its implementation is deemed discriminatory in its filtering practices. For example, Websense allows an administrator to block sites based on various topics, including religion. If one chooses, one can delve further into the category hierarchy to block either "Traditional Religion" or "Non-Traditional Religions and Occult and Folklore." Unfortunately, the application of these categories appears inconsistent and arbitrary. While numbers of adherents do not strictly define "traditional" versus "non-traditional" religions, one major category of religious sites was categorized as non-traditional even though there are hundreds of thousands of self-described adherents in the United States alone.
In addition, Websense allowed a connection to the web site of one religious organization while blocking a web site of former adherents who criticized that organization. It is easy to imagine an organization finding itself liable for discrimination if it enforces restrictions against one group of religions and not another. Content filters also allow administrators to apply content-filtering rule-sets to particular machines, groups of machines, or the entire network, which could cause problems if the blockages are enforced unequally within the organization. While categorizing popular online gambling sites and other clear "time-wasters" can increase productivity, and blocking known sources of malware can protect the network from infection, blocking content in a discriminatory fashion, especially content related to people's beliefs or political opinions, raises serious legal concerns and has the potential to damage workplace functionality.

False Positives / False Negatives

One major danger in using URL categorization is the occurrence of false positives and false negatives. A false negative condition would allow malicious, illegal, or other undesired content to traverse the network, while a false positive condition would block work-related, mission-critical, innocuous, or allowed-but-controversial content. The self-proclaimed 97% accuracy of one product still leaves enough errors to create numerous help-desk calls and hamper productivity in a large environment with significant traffic. In addition, the granularity of the filtering system may be too coarse to distinguish safely between a legitimate web site about a controversial topic and unsafe web sites involved in the controversy. Web sites that host security-related information for security professionals, including snippets of malicious code, links to compromised sites, and descriptions of vulnerabilities, might also be blocked (especially when keyword or relational blocking is used), reducing the ability of legitimate researchers and system administrators to access necessary resources.

Solutions Considered

CPSecure Content Security Gateway (CSG) Series

CPSecure's CSG Series is a line of hardware-only appliances that offer what the company refers to as stream-based filtering, which it claims introduces less latency into the flow of network traffic than other types of filtering. The hardware device resides inline, at the gateway, behind a firewall but in front of an organization's hosts. Unfortunately, we were unable to procure a test unit and cannot verify the company's claims. We could not find any literature describing specifically how the "stream" scanning works from a technical standpoint, so its effectiveness is not known. The device appears to be signature-based and receives daily updates like most other products in its class. Its distinguishing feature seems to be its reduced latency compared to similar products. CPSecure does not advertise the cost of its products, but its licenses include support for an unlimited number of users.

Blue Coat WebFilter

This hardware product is a URL inspector that watches for known bad sites. Its URL database is grouped into numerous categories and supports the creation of custom categories. Setting this product apart from other URL-filtering products is its Dynamic Real-Time Rating (DRTR) algorithm, which analyzes web page content in real time for any site that is not in the database.
WebFilter evaluates an uncategorized site based on the vocabulary contained within it and the categories of known sites to which it links. We could not determine the accuracy of WebFilter's algorithm, its ease of use, or its resistance to false positives. Administrators should consider whether simple URL filtering alone is sufficient to protect their networks against malware-based threats. Database updates are loaded on demand. For malware, spyware, and phishing, WebFilter does offer some smart options: disallowing access to sites with expired or mismatched SSL certificates, warning users about data entry on sites with questionable URLs, and recognizing well-known spyware or botnet-like activity. Blue Coat does not advertise its prices or licensing terms.

WebSense

WebSense is a software solution and requires a Windows machine with considerable computational and disk resources. It performs URL categorization, port blocking, and naive file-type identification. It offers the ability to add machines for increased scanning and storage capacity, which provides some scalability and redundancy, and it runs on COTS hardware and operating systems. The current version of WebSense does not perform signature-based scanning or binary file-type identification, and it cannot block connectionless traffic, like UDP, because it is an eavesdropping solution rather than an inline solution.

SurfControl Web Filter

SurfControl's software solution not only monitors and filters but also offers reporting capabilities, so that a high-level diagnosis of recent usage trends and activity can be more readily obtained. Additionally, it interfaces with numerous third-party products: Windows, Microsoft ISA, Novell BorderManager, Blue Coat ProxySG proxy devices, CheckPoint firewalls, Citrix Presentation Server, Cisco CE and PIX, and Juniper Networks' firewall and VPN products. Using both host- and network-based monitoring, this distributed (not inline) solution allows custom policy rule sets to be implemented. Web Filter has separate threat databases for instant messaging (IM), peer-to-peer (P2P) applications, spyware, and games. SurfControl does not advertise prices or licensing terms, but its web site notes that a license to use one of SurfControl's products covers the use of any of its other products.

8e6 R3000

This Red Hat Linux-based hardware appliance uses an eavesdropping method of content filtering similar to that of WebSense. It can also operate as a router and firewall, and claims to filter based on traffic protocol, source, or file type. Licenses are volume-priced per user, with 1-, 2-, or 3-year subscriptions available; 8e6 does not advertise prices. We were unable to procure a device for testing, but a survey of the whitepapers published by 8e6 suggests that the R3000 performs basic URL categorization filtering, port blocking, and naive file-type identification. One of its white papers claimed that pass-by technology (what we refer to as eavesdrop filtering) is only available as a hardware solution, but this claim is patently false: WebSense performs substantially similar functions to the 8e6 products in software rather than hardware.

Alternate Solutions

Both content-filtering solutions and IPS devices watch internet traffic and attempt to block or filter unwanted content. However, their goals, targets, and methodologies differ in significant ways. Many content filters attempt to prevent or limit access to information based on predetermined categories.
Thus, these content filters only address the category under which a given URL has been placed rather than the actual content contained within a transmission. For this reason, traditional content filters are poorly suited for detecting and preventing malware from novel sources. Most vendors exaggerate the malware-defense capability of their products, which are at best successful at stopping non-novel sources of malware. They cannot recognize the malware itself, nor can they recognize new sources of distribution, unless they use a signature-based scanning technique.

IPS devices and some sophisticated content filters address this problem directly, inspecting each packet that comes across the line for malware or other malicious activity. This deep packet inspection requires much greater computational resources; thus, these machines are typically much more expensive than simpler content filters. The advantage of IPSs is that they can remain entirely ignorant of the source or destination of the content. Further, many devices in this class incorporate intimate knowledge of protocols and their respective weaknesses, allowing them to recognize disguises such as TCP fragmentation attacks and packet reordering; devices capable of TCP stream reassembly and sequence reordering are also available. As a class, IPS devices are more intelligent and more capable of effectively targeting and stifling the transmission of malware and malicious code than are content filters, because they are designed to identify the malware itself rather than make a determination based on the source of the content. Thus, administrators using IPS devices instead of content filters will also spend less time carving out exceptions to their filtering policies.

Unfortunately, this sophistication comes with a significant financial cost. First, while content filters can be obtained for under $1000, IPS devices often cost tens of thousands of dollars. Second, if an IPS malfunctions, it might be much more difficult to diagnose and remedy the source of the failure due to its increased complexity. Finally, administrators must place IPSs inline, creating a potential for bottlenecks and introducing design problems for complicated networks with multiple connections to the internet. In general, a sufficiently large and well-funded organization could benefit from using both: a categorizing content filter to quickly eliminate connections to sites that are known to be bad, and an IPS to dissect the payloads of unblocked traffic into and out of the network.