

    The Streamlined Topology is the ‘hot new thing’ in SharePoint topologies. While the guidance is written for SharePoint 2013, it can also be applied to SharePoint 2010, with the exception of the Very Low Latency tier. The promise of this topology is better performance, with low latency for end-user-facing services. Let’s put the Streamlined Topology’s performance to the test!

    This scenario should apply to most Service Applications designed to run on the Low Latency tier within the Streamlined Topology. I’m using SSRS here simply because it exposes an easy-to-access render time (that is, end-user-perceived performance) in the Execution Log. The key number we’ll be looking at is the TimeRendering value, which changes depending on which server processes the report.

    In this example, SP04 is the Low Latency tier, the server running the Foundation Web service that end users access directly. The other server, SP05, is the Batch tier. For the purposes of this test it also has SSRS installed, but it is not the SharePoint server end users directly interface with. This means that the Low Latency tier, SP04, must communicate with the Batch tier in order to gather the data needed to render the report.

    The report itself is basic: a Bing Map, a Tablix, and a Matrix, all attached to a Data Source backed by a SharePoint List on the same Web Application. Aside from the default Service Instances and SQL Server Reporting Services, no other Service Instances or services are running on the SharePoint servers. The virtual machines run on the same SSD; the SharePoint 2013 Service Pack 1 VMs are allocated 8GB of vRAM and 4 vCPUs each, while the SQL Server uses dynamic memory with a 2GB startup allocation and 1 vCPU. All servers run Windows Server 2012 R2 with Update 1. This example uses the following query against the Reporting Services database to collect the performance data.
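The query itself was lost to the syntax highlighter, but a query along these lines surfaces TimeRendering per run. This is a hedged sketch against the documented ExecutionLog3 view; note that in SharePoint-integrated mode the ReportServer database name typically carries a GUID suffix:

```sql
-- Sketch: pull the most recent render timings from the SSRS execution log.
-- ExecutionLog3 is the documented logging view in the ReportServer catalog.
SELECT TOP 20
    InstanceName,       -- which SSRS instance (SP04 vs SP05) served the request
    ItemPath,
    TimeDataRetrieval,  -- ms spent fetching data
    TimeProcessing,     -- ms spent processing the report
    TimeRendering,      -- ms spent rendering: the key metric in this test
    TimeStart
FROM ExecutionLog3
ORDER BY TimeStart DESC;
```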
[crayon-53ff4c99c6b67428346980/] To get a cleaner gauge of performance, an Iisreset is performed each time the Service Instance is moved from one server to the other. In addition, the first three rendering results of each run are discarded to account for JIT compilation of assemblies. The report is not cached and does not have snapshots enabled.

    The test starts with the SQL Server Reporting Services Service Instance started on SP05, the Batch tier, and stopped on SP04, the Low Latency tier. Using Internet Explorer 11 on Windows 8.1 Update 1 and navigating directly to the report, the report is refreshed with Ctrl-F5 (a forced reload) in the browser. Again, the first three results are discarded, keeping the five remaining results. Here is the raw data for the run on SP05: Next, the reverse test is performed: SSRS is started on SP04, the Low Latency tier, and stopped on SP05, the Batch tier.

    With SSRS running on SP04, render times drop from an average of 936.8 ms on SP05 to 574.4 ms on SP04, a difference of 362.4 ms, or roughly a third of a second. While that number may not appear significant, report rendering feels much faster with SSRS running on SP04, and how a page feels is extremely important to the end user! The report used in this test is basic; for the more complex reports you likely have in your environment, where rendering performance is crucial, the delta would be even higher.

    While the Streamlined Topology model should be a serious consideration, it also requires appropriate allocation of hardware (or virtual machines). If the Low Latency tier cannot handle the load of both end users and the Service Instances, end-user performance isn’t going to be at an acceptable level. In that case, consider the Traditional Topology model instead (the classic “WFE” and “App” servers).
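The per-run setup described above (start SSRS on one tier, stop it on the other, reset IIS) can be sketched with the standard Service Instance cmdlets. Server names match the post; the TypeName wildcard is an assumption and should be verified with Get-SPServiceInstance:

```powershell
# Sketch: swing the SSRS Service Instance from SP05 (Batch) to SP04 (Low Latency).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Get-SPServiceInstance -Server SP04 |
    Where-Object { $_.TypeName -like "*Reporting Services*" } |
    Start-SPServiceInstance

Get-SPServiceInstance -Server SP05 |
    Where-Object { $_.TypeName -like "*Reporting Services*" } |
    Stop-SPServiceInstance

iisreset   # run on each server so timings start from a cold worker process
```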
    Another consideration: if the Business Intelligence services consume too many resources (CPU and memory), isolate them onto their own tier. While this will increase rendering times for those Service Instances compared to a Low Latency tier server with sufficient resources, it is better than a Low Latency tier server that cannot handle the Service Instance at all. And since only the Business Intelligence services are isolated, other Service Instances still benefit from the Streamlined Topology model.

    One important consideration with the Streamlined Topology model (and even the Traditional Topology) is to dedicate servers to the Very Low Latency tier, where the Distributed Cache service runs. The Distributed Cache service is very sensitive to latency, and other services running on a Distributed Cache server can negatively impact it, causing poor performance for end users.

    With servers that have proper resource allocation, I believe the Streamlined Topology should be the topology of choice for new SharePoint farms.

    The post Streamlined Topology Performance appeared first on Nauplius.



    The September 2014 Cumulative Update for SharePoint 2013 has been released.

    SharePoint Foundation: http://support.microsoft.com/kb/2883087
    SharePoint Server 2013: http://support.microsoft.com/kb/2883068
    Project Server 2013: http://support.microsoft.com/kb/2883072
    Office 2013 September 2014 Cumulative Updates: http://support.microsoft.com/kb/2995905

    The post SharePoint 2013 September 2014 Cumulative Updates appeared first on Nauplius.



    The September 2014 Cumulative Update for SharePoint 2010 has been released.

    SharePoint Foundation:
    SharePoint Server 2010: http://support.microsoft.com/kb/2883103
    Project Server 2010: http://support.microsoft.com/kb/2883006
    Office 2010 September 2014 Cumulative Updates: http://support.microsoft.com/kb/2995904

    The post SharePoint 2010 September 2014 Cumulative Updates appeared first on Nauplius.



    SharePoint 2013 and Office Web Apps communicate authorization via a JSON Web Token (JWT). This JWT contains important information about the caller, such as the username, SID, and time-related information, as well as where the request originated. A JWT is not encrypted by default; it is simply Base64-encoded, so its contents are trivially unpacked. So how does this apply to SharePoint?

    If you’ve followed what I do for any period of time, you’ll probably know I’m in the SharePoint-related TechNet/MSDN forums on a daily basis. I see a lot of fun configuration issues, and one thing I’ve seen on a fairly regular basis is Office Web Apps administrators setting their WAC farm to AllowHTTP = $true. This means the JWT is passed over the wire in plain text, instead of WAC’s default of leveraging an SSL certificate for transport encryption.

    To demonstrate this, two Domain User accounts are created: NAUPLIUS\User1 and NAUPLIUS\User2. User1 is granted Contribute rights on a Word document in a SharePoint site. User2 has no access to the Word document or the SharePoint site. When User1 opens the document in Word Web App, a packet capture is performed to capture the JWT and URI. Here we can see two key pieces of information: the access token value (“eyJ0eXAiOiJKV1QiL…”) and the Web the file is located on (“http://spwebapp1/sites/team”). The JWT can be unpacked to examine its contents: [crayon-541e2af396a3b089217980/] From the JWT, using the first value in appctx (“def25ded79194e09bf5e340635599a85”), it is trivial to construct a web request to view this document. Using a custom console application that fires off the WebBrowser control, the above information can be entered and the document viewed under any user account, regardless of that account’s access level in SharePoint. In this example, NAUPLIUS\User2 has leveraged the JWT from NAUPLIUS\User1 to view the document.
And remember, NAUPLIUS\User2 has no access to the Site Collection or the Word document. Based on this, it is technically feasible to leverage the JWT to impersonate another user: having access to a JWT is equivalent to having a username and password for applications that rely on JWTs. The lesson here is that transport encryption, typically in the form of SSL, is vital to securing a JWT. If SSL were enabled on the WAC farm, it would not be possible to capture the JWT off the wire. Always use SSL in production environments, and consider it in development environments that traverse networks or contain sensitive data.
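As an aside, "unpacking" a JWT requires nothing more than Base64Url decoding. This sketch (the token value is a made-up placeholder, not the one captured above) shows how little stands between a packet capture and the claims:

```powershell
# Sketch: decode the payload segment of a captured JWT.
# A JWT is three Base64Url segments (header.payload.signature); none are encrypted.
$jwt = "eyJ0eXAiOiJKV1QiLCJhbGciOiJub25lIn0.eyJuYW1laWQiOiJ1c2VyMSJ9."  # hypothetical token

$payload = $jwt.Split('.')[1].Replace('-','+').Replace('_','/')
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }

# Emits the JSON claims (username, SID, appctx, expiry) in plain text.
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload))
```

On the WAC side, the real fix is to leave AllowHttp at $false and bind a certificate to the farm (for example, via the CertificateName parameter of Set-OfficeWebAppsFarm) so the token only ever travels over SSL.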

    The post The Dangers of AllowHttp for SharePoint appeared first on Nauplius.


  • 10/02/14--10:44: Renewed as a SharePoint MVP
  • So I’m one day late, but yesterday I was renewed as a SharePoint MVP for the 3rd year. I’m extremely honored and grateful to be awarded! Lots of big plans for open source solutions this year, and more product reviews. You can find more about the MVP Program at the Microsoft Most Valuable Professional site as well as at the MVP Award Blog.

    The post Renewed as a SharePoint MVP appeared first on Nauplius.



    To use SQL Server Reporting Services in a multi-tenant situation, follow the standard SSRS setup: install SSRS on a SharePoint server, start the Service Instance, create a Service Application and Proxy, and assign the Proxy to the Web Application hosting your multi-tenant sites. For the Feature Pack, only two Features are required to enable Reporting Services support. In this example, these Features are added to an existing Feature Pack. [crayon-545b1970213c5910652830/] That’s it. Now the Site Collection “Report Server Integration Feature” and Web “Report Server File Sync” Features can be enabled, and the SSRS Content Types will be available.
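The elided snippet likely used the multi-tenancy Feature Pack cmdlets. A sketch, assuming an existing Feature Pack; the feature pack ID is a placeholder and the feature definition names are from memory, so verify them first:

```powershell
# Sketch: add the two SSRS Features to an existing Feature Pack.
# Verify the names with: Get-SPFeature | Where-Object { $_.DisplayName -like "*Report*" }
$fp = Get-SPSiteSubscriptionFeaturePack -Identity "<feature-pack-guid>"

# Site Collection scoped: "Report Server Integration Feature" (assumed name)
Add-SPSiteSubscriptionFeaturePackMember -Identity $fp -FeatureDefinition "ReportServer"

# Web scoped: "Report Server File Sync" (assumed name)
Add-SPSiteSubscriptionFeaturePackMember -Identity $fp -FeatureDefinition "ReportServerItemSync"
```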

    The post Reporting Services Feature Pack appeared first on Nauplius.



    SharePoint vNext is “right around the corner,” “going to be out any day now,” or, really, done when it’s done. That being said, I have a few wishes for SharePooint vNext from an IT Pro perspective. These are in no particular order, but here is my SharePoint vNext Wish List.

    • Administrator audit log. Auditing support for actions performed by administrators, for example, changing values of objects like SPWebApplication in PowerShell. This data should be logged as plain text and/or into a separate SQL database. Reporting can be the standard Excel, with APIs for expansion.
    • Full SQL retargeting support without an alias/AOAG. I should be able to use a cmdlet to update the registry references to a Configuration database. This enables mobility in scenarios where a SharePoint Administrator has not implemented an alias, by design or by mistake, or where only the Configuration/Administration database must move while leaving the remaining databases behind.
    • A read-only Farm Administrator (auditor?). This enables the business, rather than IT, to review what a SharePoint Administrator has configured.
    • SharePoint Management Shell access without Local Administrator access. Let me delegate tasks that do not require Local Administrator rights (e.g. provisioning a Web Application requires Local Administrator rights; just remove that functionality from the Shell, like Central Administration).
    • A SharePoint cmdlet to replace stsadm -o sync. The -o sync operation is the last stsadm command I use on any regular basis; access to the underlying API is limited, so it isn’t currently possible to provide a (supported) PowerShell replacement.
    • Configurable SPWebApplication max file size. This boils down to a single property on the SPWebApplication; it’s just currently hard-coded to 2GB.
    • Allow ‘stretched farms’ with greater than 1ms of latency or less than 1Gbps of bandwidth between servers. Provide a range of supportability, e.g. 1–10ms is okay, >11ms is not, and provide per-Service Application guidance.
    • Farm-wide VSS provider.
    • Support for Hyper-V replication/snapshots/online backup. Briefly, make a call to ‘freeze’ the state of the farm. Right now administrators have to contend with patches that may break either during deployment, or shortly thereafter during patch validation. Being able to apply a snapshot to a farm would allow immediate reverting, because let’s be honest, testing a patch in pre-production isn’t always the same as deploying and testing in production.
    • Selectable components for Backup-SPFarm. Right now the story is back up the entire farm or a single object, which is rather inflexible.
    • SharePoint virtual machine template support. Let me build a SharePoint VM, add it to the farm with particular services (Foundation Web, Excel Calculation Services), then flag it as a template within the farm. Integrate the removal of the server with sysprep: the sysprep process would remove the server from the farm but store the template information within the resulting VM template. Upon deployment (with SCVMM, for example), I could select the template and immediately have a server with my pre-configured services. Alternatively, build a ‘template’ into psconfig/the Config Wizard with pre-defined Service Instances and server configuration data; on a run of psconfig/the Config Wizard, I would select a pre-defined template and that server would be ready to go immediately after deployment.
    • Support for trimming the Audit Log in Content Databases with a scope of less than one day. The API is currently limited to day increments, which causes issues when full auditing is enabled on a farm with heavy traffic; the audit trimming command may fail even with a scope as small as a day. In addition, allow me to simply drop all data within the audit table. Support exporting the audit data via an automated/timed process to an external database, or even better, creating audit data directly in a non-Content Database. Audit data can cause significant bloat, and it is also possible to have “orphan” audit data, where the Site Collection has been deleted but the audit data remains in the Content Database. The only solution today is to migrate all Site Collections to a new Content Database and abandon the existing one. Another alternative would be to let me specify a Site GUID and clear anything with that GUID from the audit table.
    • Support AlwaysOn read-intent. When performing a read action, query the read-only copy of the data for performance improvements.
    • Remove the Timer Job cache from disk and store that data in memory. It would be faster to query, less prone to host-based antivirus issues, and, if the memory can be kept in sync with other farm members, would prevent the “object update conflict…” type errors (AppFabric for the timer job cache? I kid…).
    • Azure database (PaaS) and Azure Blob Storage support. Much of this would depend on latency.
    • Move PowerPoint and Word Automation Services out of SharePoint into Office Web Apps, and provide CSOM APIs for these services.

    What are your wishes for the next version of on-premises SharePoint?
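For context on the max file size wish: today the limit is a per-Web Application property, expressed in megabytes and capped at 2,047 MB (the 2GB ceiling the wish refers to). The URL below is hypothetical:

```powershell
# Today's ceiling: MaximumUploadSize is in MB and capped at 2,047 (~2GB).
$wa = Get-SPWebApplication "http://spwebapp1"
$wa.MaximumUploadSize = 2047
$wa.Update()
```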

    The post My SharePoint vNext Wish List appeared first on Nauplius.



    The November 2014 Cumulative Update for SharePoint 2013 has been released.

    SharePoint Foundation: http://support2.microsoft.com/kb/2899468
    SharePoint Server 2013: http://support2.microsoft.com/kb/2889944
    Project Server 2013: http://support2.microsoft.com/kb/2889949
    Office 2013 November 2014 Cumulative Updates: http://support2.microsoft.com/kb/3012396

    The post SharePoint 2013 November 2014 Cumulative Updates appeared first on Nauplius.



    The November 2014 Cumulative Update for SharePoint 2010 has been released.

    SharePoint Foundation:
    SharePoint Server 2010: http://support2.microsoft.com/kb/2899478
    Project Server 2010: http://support2.microsoft.com/kb/2899479
    Office 2010 November 2014 Cumulative Updates: http://support2.microsoft.com/kb/3012395

    The post SharePoint 2010 November 2014 Cumulative Updates appeared first on Nauplius.



    Attaching a Content Database from a Classic Web Application to a Claims-enabled Web Application does not automatically convert the users contained within that Content Database from Classic to Claims. When you run Test-SPContentDatabase, it performs quite a few tests against the target Content Database, including a check of whether users within the UserInfo table are Claims-enabled on a Claims-enabled Web Application (and vice versa). [crayon-5479e586b68c3262094330/] When Test-SPContentDatabase is executed (without additional switches), it runs the following T-SQL against the UserInfo table of the Content Database being tested: [crayon-5479e586b68dc482096796/] If the query returns any results, the message above is displayed. So what is the solution here? Just migrate the users… [crayon-5479e586b68f5525880571/] This will migrate any valid users within the database from Classic to Claims. Running Test-SPContentDatabase should now succeed without any warning message, unless any Site Collection Administrator has been deleted from Active Directory and cannot be converted! In that case, identify all Site Collection Administrators in the Content Database (e.g. [crayon-5479e586b68ff087404048-i/]), remove them from the Site Collection Administrators and, as an optional step, delete them from the Site Collection. This way, they will no longer be marked as [crayon-5479e586b6907328386092-i/] and will not match the Test-SPContentDatabase query against the UserInfo table.
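A sketch of the elided steps, using the legacy MigrateUsers() API (on SharePoint 2013, Convert-SPWebApplication is the supported route for full Web Application conversions); the URL and database name are hypothetical:

```powershell
# Test the Content Database against the target (Claims) Web Application.
Test-SPContentDatabase -Name "WSS_Content" -WebApplication "http://spwebapp1"

# Migrate Classic logins in the Web Application's databases to Claims.
$wa = Get-SPWebApplication "http://spwebapp1"
$wa.MigrateUsers($true)   # $true = migrate from Classic to Claims

# Re-test; the UserInfo warning should no longer appear.
Test-SPContentDatabase -Name "WSS_Content" -WebApplication "http://spwebapp1"
```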

    The post Test-SPContentDatabase Classic to Claims Conversion appeared first on Nauplius.



    The December 2014 Cumulative Update for SharePoint 2013 has been released.

    SharePoint Foundation: http://support2.microsoft.com/kb/2910945
    SharePoint Server 2013: http://support2.microsoft.com/kb/2910938
    Project Server 2013: http://support2.microsoft.com/kb/2910911
    Office 2013 December 2014 Cumulative Updates: http://support2.microsoft.com/kb/3020816

    The post SharePoint 2013 December 2014 Cumulative Updates appeared first on Nauplius.



    The December 2014 Cumulative Update for SharePoint 2010 has been released.

    SharePoint Foundation: http://support2.microsoft.com/kb/2899585 (Not an uber package)
    SharePoint Server 2010: http://support2.microsoft.com/kb/2899583
    Project Server 2010: http://support2.microsoft.com/kb/2899587
    Office 2010 December 2014 Cumulative Updates: http://support2.microsoft.com/kb/3020815

    The post SharePoint 2010 December 2014 Cumulative Updates appeared first on Nauplius.


  • 12/12/14--08:13: FoundationSync 2.5 Release
  • FoundationSync 2.5 for SharePoint Foundation 2013 has been released! This version adds functionality to pull user photos from Active Directory or Exchange Server 2013. More information can be found at the FoundationSync wiki.

    The post FoundationSync 2.5 Release appeared first on Nauplius.


  • 12/22/14--09:57: SPCAF Review
  • Disclaimer: SPCAF is free for Microsoft MVPs, and I was provided an MVP license to use this product.

    I will be taking a look at the SPCAF Client, Visual Studio integration, and PowerShell for this review. The primary purpose of SPCAF is code analysis and correctness and, of course, migration assessment from SharePoint Solutions to SharePoint Apps. SPCAF does not require the SharePoint binaries, therefore it can be run on virtually any Windows desktop system, which is great news for IT Professionals and developers creating SharePoint Apps.

    I’m going to take a look at my own solution, FoundationSync. This is a solution geared towards SharePoint Foundation that updates the User Information List for each Site Collection on a daily basis, with the latest version for SharePoint Foundation 2013 pulling pictures from Active Directory or Exchange 2013.

    To get started with the SPCAF Client application, on the Home Page simply drop one or more solutions (.wsp and .app are supported) into the wheel and click the Play button. When solution analysis has completed, it presents a screen similar to below, noting code quality issues, an inventory of the solution, internal and external dependencies the solution takes, and metrics. Notice the Settings next to the project name: this run uses the default rule set, which is an extended rule set. Many issues may be flagged by this rule set that are not applicable to your solution, or that the solution must perform.

    The Code Quality tab presents an easy-to-read, filterable view. Clicking on any one of the levels, such as Errors or Critical Warnings, filters the results to that level. Below this screen are the actual listed errors, each with a description, a resolution, and links if applicable. It also displays the portion of code where the rule was violated. In this case, because it is the SPCAF Client, the solution is decompiled and may not exactly match the code as written, but it is close enough for these types of rule violations.

    In this particular solution, the SPFarm property bag was leveraged to store certain values. In order to create or update a property bag item, SPFarm.Update() must be called. According to SPCAF, this is a rule violation. However, because this was the preferred, and frankly easiest, method to store this information, this particular violation isn’t applicable. For this rule, SPCAF does provide a help link explaining why it is considered a violation. Because this violation will be ignored, I can consider decorating the method and suppressing the rule; SPCAF outlines this in its online help documentation.

    Here is an example of a rule that should be considered. The guidance I’ve always heard is “dispose of SPSite and SPWeb”; apparently, disposing of SPSite.RootWeb should not be done. Again, because the code is decompiled, it was not exact (I was performing a “using(SPWeb web = site.RootWeb){}” in this particular case), but as a developer it is easy to translate. Another example of a warning is a missing image for a Feature. While this isn’t particularly worrisome, it is an example of something that can be easily missed and quickly corrected.

    To round out the IT Pro and QA toolset, SPCAF recently released a PowerShell module to analyse solutions, which can be integrated into an automated process. While the module is free, most of its features require the individual running the cmdlets to have an SPCAF license, although the output reports can be shared with anyone. The PowerShell module can be downloaded from the TechNet Gallery; the package includes two example PowerShell scripts, the module, and a license file (.lic) that can be used to fill in the license information. To see the output of this cmdlet, I’ve made the reports available here: FoundationSync_SPCAF_Results. The output from PowerShell is included below. [crayon-54a4ae7571dbc062012909/]

    For the developer, Visual Studio has a similar toolset, but instead of examining a WSP, it compiles your code immediately before analysis and provides the output in the Error List window, a much more convenient workflow for developers. As with the desktop client, you can also create a Dependency Graph (DGML) for your solution, and reports are exportable so others can consume them. This is great, because not everyone requires a license to view SPCAF’s output.

    In summary, Rencore AB provides three different methods of reviewing and evaluating code quality for developers, QA analysts, and IT Professionals. SPCAF has a trial license available which provides a good overview of the product’s capabilities. Microsoft MVPs, MCSMs/MCMs, and Microsoft FTEs may also request an SPCAF Enterprise license for free. I highly recommend it from the IT Professional perspective, simply because it may provide a way to keep code quality up and prevent lower-quality solutions submitted by developers from causing stability issues in a farm.

    The post SPCAF Review appeared first on Nauplius.


  • 12/31/14--08:30: Top 10 Posts of 2014
  • This year was a lot of fun! A handful of speaking engagements at the Puget Sound SharePoint Users Group, one at SharePoint Saturday Redmond, along with organizing volunteers for SPS Red. And of course, a lot of posts on SharePoint! Onto the top 10 posts of 2014, view-wise. Overall, view-wise, #1 and #2 were actually from 2013! Old posts get a lot of attention, but they don’t count. Workaround for April 2014 CU and MS14-022 Double Encoding Bug This one I’m really not surprised about, at all. This was a bug that was active for roughly 6 months, before being resolved in the September 2014 Cumulative Update. It was a major issue for on-premises farms, and while there was a “supported” workaround through modifying Display Templates, that involved a significant amount of labor and did not cover all cases of this particular bug. This workaround was definitely unsupported, but quick to implement even with a large number of servers in the farm. MS14-022 Known Issues Again, not surprised at all. Keeping people up-to-date on active, long running bugs is extremely important. During the time frame this bug was active, 2 subsequent SharePoint security hotfixes introduced this same bug. It is a careful balance between making sure your farm is up-to-date and secure, and the bugs those security hotfixes introduce. SharePoint and the Web Application Proxy Role A brand new feature introduced with Windows Server 2012 R2, we started seeing the future replacement of Threat Management Gateway and Unified Access Gateway. UAG was announced as a discontinued product in December of 2013, while removed from the price lists mid-2014. TMG was discontinued way back in 2012. While WAP in 2012 R2 doesn’t function correctly with SharePoint Apps due to the operating system limitations, this has been corrected in the preview of Windows Server 10. 
PowerShell for People Picker Properties Microsoft guidance for many of the People Picker properties, such as configuring a One-Way Trust, is to leverage stsadm. Stsadm has been deprecated since SharePoint 2010. This post covered some key areas where stsadm can be replaced, including setting the Application Credential Key, critical to configuring that One-Way Trust. SharePoint 2013 April 2014 CU Claims Conversion Bug Another post, another bug! Fortunately this particular bug was short-lived, corrected in the June 2014 Cumulative Update, but it prevented the Classic to Claims conversion. It also added some yet-to-be documented switches to Convert-SPWebApplication cmdlet. Microsoft, do you need another tech writer for TechNet? :-) SharePoint 2013 Service Pack 1 Released Of course, Service Pack posts are popular. They only come once a year, and they’re a big deal for a lot of administrators who stick to just the major builds. People Picker Troubleshooting Tips These are general tips of what I often look at while troubleshooting various People Picker issues. They cover some of the most common scenarios that I’ve seen on the TechNet forums. What is the SharePoint Configuration Cache? Clearing the Configuration Cache (aka Timer Job Cache) seems like the panacea of SharePoint fixes. While we’re told to do this, it often comes with a lack of understanding of what the Configuration Cache actually is. This post also explores the inner workings of the Configuration Cache. The Expense of Application Pools The advise for SharePoint in terms of configuring Application Pools has changed over time. Back in 2007, everything was to be separate (within various limits). Or for “security purposes”, Service Applications had to run under unique identities. But starting late into SharePoint 2010’s lifecycle, with the increase use of BPOS and eventually SharePoint Online in Office 365, Microsoft changed their tune telling us that less was far better. 
One thing I hadn’t seen was why it was better, and even I was surprised by the results. Less really is better! SharePoint 2013 and Office 365 with Yammer Integration Service Pack 1 for SharePoint 2013 introduced Office 365 for MySites/OneDrive for Business and Yammer integration. This was a simple post detailing how an administrator would go about setting this functionality up.   Below are my personal favorite posts of this year. Streamlined Topology Performance This was another recommendation by Microsoft for SharePoint topologies where we were just told “it is better”. Using Reporting Services as a test for this topology (primarily because the numbers are very easy to access), it shows just how that extra trip between the “WFE” and “App” server (the “traditional” topology) can have a negative impact on performance, and why an administrator should consider converting to the streamlined topology. The Expense of Application Pools Another fun post, using the Sysinternals VMMap tool. I don’t often get to do this type of investigation, simply because it is just so easy to throw more memory at a SharePoint server than to optimize the layout. Hyper-V Private Networks for SharePoint While a lot of developers will attest to running SharePoint in the cloud (Azure, CloudShare, etc.) for testing purposes, my test labs often involve an entire infrastructure; Active Directory, Exchange, SharePoint, possibly reverse proxies, ADFS, and so on. While testing just SharePoint environments in the cloud is great, the expense of testing the whole infrastructure is way outside of my personal budget. The strategy outlined in this post was one I used frequently on laptops where I’d move around from network to network, but needed to maintain a private IP range and Internet access for my VMs. 
SharePoint with Apache mod_proxy I had seen this one, and even received requests of a “how to” from Microsoft employees on information for their customers that already deployed mod_proxy/mod_ssl in their environments and wanted to put SharePoint behind it. What can I say, I like reverse proxies! SharePoint Database Availability Group Cmdlets These cmdlets when unmentioned on TechNet until November of this year, and even then are only mentioned in the Move All Databases TN page. Learning how these cmdlets work is essential for those deploying SharePoint 2013 SP1 or higher against an Availability Group, as it opens up new options for the SharePoint Administrator, and reduces the workload for the DBA. What is […]
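For reference, the Availability Group cmdlets mentioned in that last entry look roughly like this; the cmdlet names are as I recall them from TechNet, and the AG and database names are hypothetical, so verify before use:

```powershell
# SharePoint 2013 SP1+ Availability Group cmdlets (sketch).
Add-DatabaseToAvailabilityGroup -AGName "SP-AG01" -DatabaseName "WSS_Content"
Get-AvailabilityGroupStatus -Identity "SP-AG01"
Remove-DatabaseFromAvailabilityGroup -AGName "SP-AG01" -DatabaseName "WSS_Content"
```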

    The post Top 10 Posts of 2014 appeared first on Nauplius.



    Full disclosure: MVPs are provided an SPDocKit Consultant Ultimate license, and I was also provided a Farm Ultimate license specifically for this review. SPDocKit, at its core, is a documentation generation engine for SharePoint farms. It produces customizable documentation about the SharePoint farm that would take quite some time to otherwise script. Based on a previous review of SPDocKit, this SharePoint DocKit 5.0.1 review will examine the new and revised features added since that time. If you’re upgrading from earlier versions of SPDocKit, like all updates for SPDocKit, the process is extremely easy. SPDocKit will detect the existing settings and upgrade everything as needed. Otherwise, there will be an option to create a new SQL database for SPDocKit. The SPDocKit database should on the SQL instance where the SharePoint Usage database is. This is a requirement in order to leverage functionality like the Feature Usage reports. Depending on the SPDocKit license, an option for enabling the SPDocKit Service will be available. This allows you to schedule snapshots of the farm configuration. The account running the SPDocKit service should have Local Administrator as well as Farm Administrator rights. In the case of this review, the Farm Administrator account will be used, but a dedicated account is likely more appropriate. Load Options provides choices for what type of data to collect about the farm, the more data that is collected, and the larger the farm, the longer this process will take to complete. This process can be very slow. If the SharePoint ULS and Windows Event Log options are selected, the Event Collection interval schedule will be available. This is where the snapshot schedule is available, depending on licensing. Once this configuration is complete SPDocKit will start running. The first step is the Load Farm Settings. 
This process will load the selected settings, populate the database, and take a snapshot of the current state of the farm, ULS, and so forth. The first option is to load Farm Settings and Permissions. Choosing Permissions will likely add a significant amount of time to the data collection process. For the Farm Load, again, the more options selected, the longer the load time. Permissions Load allows you to select the scope at which permissions should be loaded. Loading down to the Item level will incur a significant load time, even on a smaller SharePoint farm. With the default settings for Farm Load, this test lab farm loads fairly quickly, as it has less than 1GB of content; selecting the option to load permissions at the Item level, however, causes very long load times. Think carefully before attempting to load permissions at the List Item level. Once the selections are loaded, the window will automatically close, or indicate if an error occurred. If an error does occur, scroll up to the item marked Failed and click the link. There will be a link to the SPDocKit site with an explanation of why it may have failed, along with possible resolutions. The biggest functionality SPDocKit 5 brings to the table is the Permissions Explorer. Not only can an administrator explore permissions down to the Item level, but it is also possible to manipulate permissions in a variety of ways. In this example, the Announcement List has inherited permissions. Only SharePoint groups are applied to the Announcement List, yet it is possible to expand those groups to see who is a member of them. Each object (user or group) has a Properties page. This displays what type of account it is (an AD User, in this case) along with other basic information. Group membership is displayed not only for SharePoint groups within the Site Collection, but also for Active Directory groups that may not be part of the Site Collection.
This can help in identifying the proper Active Directory groups to add to the site. The Permissions Explorer wizards are extremely powerful. From here it is possible to fully manage groups, as well as their memberships. Another welcome feature, which is not available in the SharePoint UI, is a breakdown of nested group memberships, as well as users who are a part of groups but are not in the User Information List. This can be handy, as it is now possible to view permission levels based on Active Directory groups for an individual user who may not yet have visited the site. More great functionality is the Clone Permissions wizard. The wizard lets an administrator duplicate permissions at the Web Application or Site Collection level. This is a great tool to have when onboarding new employees where domain-based groups are not in common usage. An administrator can also transfer permissions from one user to another; again, another great feature for when employees move on. Additional wizards include "cleaning" a Site Collection, where it is possible to remove users and groups en masse. The last two wizards involve managing the Site Collection Administrators group and SharePoint Permission Levels. Queries and Rules is additional new functionality that is very powerful. Queries allow an administrator to gather and report on very specific information about Web Applications and Site Collections; for example, a custom-built report on Site Collections and storage. From the report view, it is possible to filter and sort, just like in Excel. And of course, the report can be exported to Excel, PDF, or Word. Within each Query is filtering functionality, as well as scheduling functionality. Reports are persisted, so it is possible to schedule an automatic query execution and review the results over time. Rules allow an administrator to enforce specific behaviors on the target of the rule.
In one of the samples provided, it will enforce Version History across Web Application(s), a Site Collection, Subsites of a Site Collection, or even specific Lists. Jumping to Best Practices, which was reviewed the last time around: again, be careful with the recommendations here. There are bugs (such as the Distributed Cache size recommendation exceeding the recommended maximum allocation of 8GB, which should be fixed soon), and there are other recommendations that won't make sense for every environment. The rules follow […]

    The post SharePoint DocKit 5.0.1 Review appeared first on Nauplius.



    The account running the IIS Application Pool leveraged by the PowerPivot System Service requires Full Read on any Web Application with the PowerPivot Service Application Proxy. However, the health rule that validates this does not function correctly. The error in Review Problems and Solutions is: MidTier process account should have 'Full Read' permission on all associated SPWebApplications. Let's dive into why this is happening, and how to resolve it. The first thing the Health Rule does is retrieve the process account name from the IIS Application Pool: [code omitted] The equivalent PowerShell is: [code omitted] Once the Health Rule has the process account, the next step is to identify who has FullRead rights in the SPPolicy on the Web Application: [code omitted] Now, here is where it fails: [code omitted] What the above code is doing is validating that the strings are equal. It is comparing the UserName on the SPPolicy to the Process Account running the IIS Application Pool. In a claims-enabled environment, this is never true, as the claims identifier on the SPPolicy isn't equal to the Process Account. As for a resolution, simply disable the rule and manually add the ProcessAccountName to the Web Application(s) with Full Read rights. This can be done via PowerShell: [code omitted] What about the automatic repair, and why doesn't that seem to work? Pretty simple, and the issue is along the same lines. What the automatic repair does is first look at a Web Application, then proceed to enumerate each User Policy on that Web Application. For each user, it adds Full Read rights (even though that user isn't the one running the IIS Application Pool the PowerPivot Service Application is assigned to).
[code omitted] The equivalent PowerShell code is: [code omitted] When it does this, it grabs the username (say, "i:0#.w|domain\username") and calls SPWebApplication.GrantAccessToProcessIdentity. This method is unable to translate the claim value into a Windows identity, thus we receive the exception "Some or all identity references could not be translated" and the repair process exits.
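The failing comparison described above can be sketched outside of SharePoint. Here is a minimal, illustrative Python sketch (the account names and the strip_claims_prefix helper are hypothetical, not SharePoint API) showing why a raw string equality test between the SPPolicy username and the IIS process account can never match under claims, and how normalizing the claims-encoded value first would:

```python
# Sketch of the health rule's failing comparison (illustrative only).
# In a claims-enabled Web Application, the SPPolicy stores the account
# as an encoded claim such as "i:0#.w|contoso\svc_powerpivot", while
# the IIS Application Pool reports a plain Windows account name.

def strip_claims_prefix(account: str) -> str:
    """Return the Windows account portion of a claims-encoded username."""
    # Encoded Windows claims look like "i:0#.w|DOMAIN\user"; everything
    # after the last "|" is the underlying account name.
    return account.rsplit("|", 1)[-1]

policy_username = r"i:0#.w|contoso\svc_powerpivot"   # from the SPPolicy
process_account = r"contoso\svc_powerpivot"          # from the app pool

# What the health rule effectively does -- always False under claims:
naive_match = policy_username == process_account

# A comparison that tolerates the claims encoding:
normalized_match = (strip_claims_prefix(policy_username).lower()
                    == process_account.lower())

print(naive_match, normalized_match)  # False True
```

This is only a model of the logic; the actual fix for the farm remains what is described above, namely disabling the rule and granting the process account Full Read on the Web Application policy manually.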

    The post PowerPivot Mid-Tier Process Account Does Not have Full Read appeared first on Nauplius.



    Creating Search Alerts in SharePoint 2013 is a tad different than it was in SharePoint 2010. SharePoint 2010 had a SearchAlert class which allowed you to easily create the alerts. This class had a single dependency on the Web Application using it: there had to be a Search Service Application Proxy attached to the Web Application. In SharePoint 2013, it is slightly different. The SearchAlert class' constructor has been marked internal, and although you can access the constructor of the SearchAlert class via reflection, using reflection is unsupported. The workaround is fairly simple, but it now has a dependency on the Web Application having a User Profile Service Application Proxy. Without the UPSA proxy, creating the alert will fail. Otherwise, this code is mostly gleaned from reflection. [code omitted] The keys here are setting the AlertTemplate to "OSS.Search" and setting MatchId to Guid.Empty.

    The post Creating Search Alerts in SharePoint 2013 appeared first on Nauplius.



    Microsoft has opened up a UserVoice to receive customer feedback on what you would like to see in SharePoint Server 2016. This is the first time we’ve been able to provide direct feedback to Microsoft on the future of SharePoint Server. Take this opportunity to make your voice heard! Customer Feedback for SharePoint Server

    The post Share your feedback for SharePoint Server 2016 on UserVoice! appeared first on Nauplius.



    The SharePoint 2013 February 2015 Public Updates have been released:
    SharePoint Foundation 2013: http://support2.microsoft.com/kb/2920801
    SharePoint Server 2013: http://support2.microsoft.com/kb/2920801
    Project Server 2013: http://support2.microsoft.com/kb/2920796
    Office Web Apps 2013 Server: http://support2.microsoft.com/kb/2956101
    Office 2013 February 2015 Cumulative Updates: http://support2.microsoft.com/kb/3032763

    The post SharePoint 2013 February 2015 Public Updates appeared first on Nauplius.


