
Monday, June 3, 2013

New SQL Server 2014 announced


SQL Server 2014 was announced at TechEd North America, now also under the “cloud-first principles”. For more details have a look here: http://blogs.technet.com/b/dataplatforminsider/archive/2013/06/03/sql-server-2014-unlocking-real-time-insights.aspx
Stay tuned, more news coming up soon, including what this means for SharePoint!

Saturday, April 6, 2013

SharePoint 2013 Analytics Feature - IT-Pro, DB Admin & PowerPivot issues


This post covers some facts and details about the new SharePoint 2013 Analytics feature. For general information about the feature, have a look at this TechNet article: LINK
In fact we have two different types of analytics in SharePoint 2013:

·         Search analytics analyzes content that is being crawled and added to the search index

·         Usage analytics analyzes user actions, or usage events, such as clicks or viewed items, on the SharePoint site
Search Analytics & Usage Analytics processing is based on several components. Continuous or incremental crawls must be scheduled and run. Also there are several timerjobs involved:

·         Analytics Timer Job for Search Service Application Search Service Application (the trailing “Search Service Application” is the name of the service application)

·         Microsoft SharePoint Foundation Usage Data Import

·         Microsoft SharePoint Foundation Usage Data Processing

·         Usage Analytics Timer Job for Search Application Search Service Application

For more details about the jobs and job descriptions have a look here: LINK
This picture shows an overview of the new components:

Search analytics

PowerShell
Some Facts:
·         There are cmdlets to create, delete, get, and move the components of the SharePoint Analytics infrastructure:
o   SPEnterpriseSearchAnalyticsProcessingComponent
o   SPEnterpriseSearchLinksDatabase
·         There are no direct PowerShell cmdlets to read information from SharePoint Analytics
·         We can use the .NET SearchServiceApplication class from Microsoft.Office.Server.Search.dll in combination with PowerShell to work with SharePoint Analytics
Here is an example script from Radu Tut showing how we can connect to SharePoint Analytics via PowerShell:
$searchApp = Get-SPEnterpriseSearchServiceApplication
$site = Get-SPSite "{SiteUrl}"
$result = $searchApp.GetRollupAnalyticsItemData(1,[System.Guid]::Empty,$site.ID,[System.Guid]::Empty)
$result
#for a specific date or for a specific month
$date = Get-Date "2013-01-18"
$result.GetHitCountForDay($date)
$result.GetHitCountForMonth($date)
So for example using the first part of the script:
$searchApp = Get-SPEnterpriseSearchServiceApplication
$site = Get-SPSite "{SiteUrl}"
$result = $searchApp.GetRollupAnalyticsItemData(1,[System.Guid]::Empty,$site.ID,[System.Guid]::Empty)
$result
yields the following result in my demo environment:

There are two methods available. You can use:
·         GetSearchReport
·         GetRollupAnalyticsItemData
Find the complete post and more information about using this in Radu Tut's blog here: How to get Search Analytics Reports programmatically in SharePoint 2013

SQL Server DBs

For me as an old SQL guy it was also interesting to see how all this is implemented in the SQL Server databases. In an enterprise environment a direct connection to the SharePoint databases is generally not used; nevertheless, here are two points why this is interesting anyway:
·         It’s interesting to see how the analytics database is used, to plan your database files, partitions, etc.
·         Only reading from the database using the given functions and stored procedures can of course be an option.
Additionally, a configuration overview for your environment can be found in the table MSSSearchAnalysisEngineConfiguration in your Search_Service_Application_DB.
The interesting table in the Search_Service_Application_DB is [dbo].[SearchReportsData].
In this table the data for the usage and search reports is stored. Yesterday's data is imported once per day from the log folder under: %InstallDrive%\Microsoft Office Servers\15.0\Data\Office Server\Analytics\EventStore
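To take a first look at that data, a minimal read-only T-SQL sketch like this can be used (assuming the default database name from above; run it only against a test environment):

USE [Search_Service_Application_DB];
-- Quick look at the imported analytics rows (read-only)
SELECT TOP (20) * FROM [dbo].[SearchReportsData];
-- Row count as a sanity check that the daily import ran
SELECT COUNT(*) AS [RowsImported] FROM [dbo].[SearchReportsData];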
 
The table shows the result in my demo environment:

Using PowerPivot for Excel we can connect to that table and analyze the data:

Usage analytics


The usage analytics data is stored in the logging database. This article describes how to access the data using SQL Views: LINK
Using these views together with PowerPivot we can create our own reports. Here is an example for the RequestUsage view showing all site requests made by “Administrator” on Central Administration.
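Here is a hedged T-SQL sketch of such a query (the column names LogTime, UserLogin, and SiteUrl are assumptions about the RequestUsage view; check the view definition in your environment):

USE [WSS_Logging];
-- Most recent requests made by a given account (column names assumed)
SELECT TOP (100) LogTime, UserLogin, SiteUrl
FROM [dbo].[RequestUsage]
WHERE UserLogin LIKE N'%Administrator%'
ORDER BY LogTime DESC;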
There is also an official way to consume usage data directly from the log files and feed it into PowerPivot hosted in SharePoint itself: LINK

Tuesday, April 2, 2013

SharePoint 2013 empty Usage and Health data report


Why the “Usage and Health Data Report” can be empty in SharePoint 2013.

In SharePoint 2013 the WebAnalytics feature known from SharePoint 2010 is no longer part of the product. There is a new feature called Analytics.

Three reasons why the “Usage and Health Data Report” can be empty

·       Permissions: missing permissions on the WSS_Logging database, as described in this article (see the sketch after this list): http://social.msdn.microsoft.com/Forums/en-US/sharepointadmin/thread/681b6c53-3969-4984-8a78-4d6e6c55f29c

·       No crawls: current data is processed during crawling. So if no crawls are scheduled for the content sources, or there is a problem with the crawlers, no current data is processed.

·       Named users: only queries and interactions done by “normal” named users are captured. If you access SharePoint using an administrative account, your queries etc. will not be shown in the reports.
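For the permission point, here is a hedged T-SQL sketch (CONTOSO\svc_reports is a placeholder for the account that reads the reports; grant rights exactly as described in the linked article):

USE [WSS_Logging];
-- Placeholder account: replace CONTOSO\svc_reports with your reporting account
CREATE USER [CONTOSO\svc_reports] FOR LOGIN [CONTOSO\svc_reports];
-- Read access is sufficient for the reports
EXEC sp_addrolemember N'db_datareader', N'CONTOSO\svc_reports';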

Monday, May 28, 2012

Custom sort option in SharePoint Search Results

This post is based on a question from the community. I thought it was also worth a blog post.

The question: a customer has the following requirement. The Core Results web part of SharePoint 2010 Search can sort by relevance and date. The customer's idea is to write his own Core Results web part in order to sort by additional properties himself. Is that a good idea? What should be considered?

That is a good and much-discussed question. There are already some approaches to this on the net, e.g. here:


or this web part:




All in all, this does work. With jQuery you can also lend a hand there. The problem with all these solutions is that the sorting is done on the client. Depending on the amount of data, this can lead to very poor performance, and it puts load on the client.

If you implement such a solution as a SharePoint feature that runs on the app server, you have at least moved the load from the client to a scalable server. The point is that in both cases the result set first has to be loaded from SQL Server, then re-sorted, and then displayed. The feature-based solution naturally scales better than a client-side one.

There is, of course, a reason why search only offers the two sort options “relevance” and “date”. The stored procedure proc_MSS_GetMultipleResults delivers the result already sorted by relevance on the SQL side (for details see here: LINK). A subsequent sort by date is easy to do and does not cost much compute time. Anything else can get out of hand, especially with larger data volumes.

I argue for doing the sorting in SQL Server. It does this best, far better and faster than .NET. This is quite easy to do if the property to sort by is one of the following: Rank, Title, Author, Size, Path, Write, HitHighlightedSummary, HitHighlightedProperties. These are included in the result from SQL Server by default. The call that goes to SQL Server looks roughly like this:

<QueryText language="en-US" type="MSSQLFT">Select PopularSocialTags,Rank, Title, Author, Size, Path, Write, HitHighlightedSummary, HitHighlightedProperties FROM Scope() WHERE FREETEXT(DefaultProperties, '%searchTerm%') ORDER BY "Rank" DESC</QueryText>

So ORDER BY is easy to adapt here. If you need custom properties, it gets more complicated. A good way to experiment with this is the tool http://fastforsharepoint.codeplex.com/ (it is called FAST, but it also works for standard search). There you can edit the XML that is sent to the search web service and experiment with it.
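For example, sorting by last-modified date instead of relevance only requires changing the ORDER BY clause of the query shown above (a sketch based on that query; Write is the standard last-modified property):

<QueryText language="en-US" type="MSSQLFT">SELECT Rank, Title, Author, Size, Path, Write, HitHighlightedSummary, HitHighlightedProperties FROM Scope() WHERE FREETEXT(DefaultProperties, '%searchTerm%') ORDER BY "Write" DESC</QueryText>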

…The official way would be to use FAST and create a custom ranking profile.

Monday, May 7, 2012

SharePoint Search – a look behind the scene Part IV

Part IV: Last part with a summary and some further issues

In the previous parts of the series “SharePoint Search – a look behind the scene” (Part I, Part II, Part III) we saw how SharePoint interacts with SQL Server and which data is stored in which SQL Server database belonging to the search service.

In Part III I showed where custom managed properties belonging to an object are stored and how SharePoint handles this correlation. As one of the missing parts, let's now have a look at social tags. Social tags are stored in the user profile and not within the content. So it was interesting for me to see that in the search database the social tag information is stored together with the content information in one table: the MSSDocResults table in the Search_Service_Application_PropertyStoreDB. The following link shows a description of the table: http://msdn.microsoft.com/en-us/library/dd773971(v=office.12).aspx  The column PopularSocialTags contains the tags given by the users.

The following example shows a T-SQL search for the tag “I like it”:

SELECT * FROM [Search_Service_Application_PropertyStoreDB].[dbo].[MSSDocResults]
WHERE [PopularSocialTags] LIKE '%I like it%'

What we also see here is that this table is “de-normalized”: for example, the tag “SQL Server” itself, and not only its ID / key, is stored in the MSSDocResults table.

This is for performance reasons. Search is read-optimized, and it is faster to deliver the data and the relation between content and social tags without doing joins first.
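To illustrate the point: finding every item tagged “SQL Server” needs no join to a tag table, just a scan of the same de-normalized column (same pattern as the query above):

SELECT COUNT(*) AS [TaggedItems]
FROM [Search_Service_Application_PropertyStoreDB].[dbo].[MSSDocResults]
WHERE [PopularSocialTags] LIKE '%SQL Server%';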
The keyword “performance” brings me to the next point. Throughout this “look behind” series we have seen, and that is also what the SQL reports tell us, that some tables are used more frequently and have a bigger impact in the context of search performance.
There is a really good article on MSDN about which indexes are used frequently. The tables most frequently used by search queries are:
-          dbo.MSSDocProp
-          dbo.MSSDocResults
-          dbo.MSSOrdinal
in the Search_Service_Application_PropertyStoreDB.
So if performance is a bottleneck, it can be helpful to place these tables / indexes in a separate filegroup on a fast disk subsystem.
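A hedged T-SQL sketch of that idea (the filegroup name FastFG, file name, and path are placeholders; note that modifying SharePoint databases this way is generally not supported by Microsoft, so treat this purely as an illustration):

-- Add a filegroup backed by a file on a fast disk to the property store database
ALTER DATABASE [Search_Service_Application_PropertyStoreDB]
ADD FILEGROUP [FastFG];
ALTER DATABASE [Search_Service_Application_PropertyStoreDB]
ADD FILE (NAME = N'PropStoreFast1', FILENAME = N'F:\FastDisk\PropStoreFast1.ndf')
TO FILEGROUP [FastFG];
-- The tables / indexes listed above could then be rebuilt onto [FastFG],
-- e.g. with CREATE INDEX ... WITH (DROP_EXISTING = ON) ON [FastFG].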
Another interesting part is security. In a highly sensitive environment we have to see clearly which data is placed in which database and how it is protected. Communication between the SharePoint application server and the SQL Server search databases is not really critical; the search requests are compiled into the BLOB data as shown in Part I. But a potentially critical point is that data coming from external systems via BCS is placed in clear text in the Search_Service_Application_PropertyStoreDB, table dbo.MSSDocProps. This can be an issue; for details see Part III.

See the complete post incl. the hands-on lab as a webcast here:

Tuesday, April 17, 2012

SQL Server 2012 Filestream, Filetable and the added value with SharePoint 2010

I wrote before about the new SQL Server 2012 FileTable feature, which is based on FILESTREAM, a feature already introduced with SQL Server 2008 (see here: LINK). Now let's see FileTable together with SharePoint in action.
First we have to activate FileTable, which is not active by default after installing SQL Server 2012. How to activate it can be found here: LINK.
My settings are:
-          There is a database named “FileStreamDemo”.
-          In the “Options” menu you can see that there is a FILESTREAM directory named “FileStreamData”
-          There is a table called “dbo.FileTableData” which points to the FileStreamData directory:
CREATE TABLE [dbo].[FileTableData] AS FILETABLE ON [PRIMARY] FILESTREAM_ON [FileStreamDemoGroup1]
WITH
(
FILETABLE_DIRECTORY = N'FileTableData', FILETABLE_COLLATE_FILENAME = SQL_Latin1_General_CP1_CI_AS
)
-          NON_TRANSACTED_ACCESS = FULL allows us to interact with the data in the FileTable directory bypassing the SQL Server API and using Windows Explorer directly (see the sketch after this list).
-          The script in the background shows the CREATE script for the database. We can see that the database files (*.mdf and *.ldf) are placed in a different directory than the FILESTREAM data.
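A minimal T-SQL sketch of the NON_TRANSACTED_ACCESS setting mentioned above (instance-level FILESTREAM must already be enabled in SQL Server Configuration Manager; the names match the demo setup):

-- Allow direct (non-transacted) Windows file system access to the FileTable directory
ALTER DATABASE [FileStreamDemo]
SET FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'FileStreamData');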
I placed several demo files in the FILESTREAM directory:

If we now query the table “dbo.FileTableData” the result is:
The columns are fixed and based on the properties coming from the NTFS file system:
Column Name         Data Type
stream_id           uniqueidentifier ROWGUIDCOL
file_stream         varbinary(max) FILESTREAM
name                nvarchar(255)
path_locator        hierarchyid
creation_time       datetimeoffset(7)
last_write_time     datetimeoffset(7)
last_access_time    datetimeoffset(7)
is_directory        bit
is_offline          bit
is_hidden           bit
is_readonly         bit
is_archive          bit
is_system           bit
is_temporary        bit

Each FileTable also includes these computed (read-only) columns:

Column Name           Data Type
parent_path_locator   hierarchyid
file_type             nvarchar(255)
cached_file_size      bigint

So we see that the content from the FILESTREAM directory is returned as BLOB data placed in the column “file_stream” as varbinary(max). By the way: the data placed here is automatically part of the backup and restore process.
If we now use BCS to connect SharePoint to this database table we have two issues:
1.       We are not able to bring the content directly to the user without writing code that extracts / transforms the BLOB data.
2.       Using SharePoint to unwrap the BLOB data generates load on our SharePoint farm. The data must be loaded from SQL Server / the file system to SharePoint, and from the SharePoint server to the client.
Both issues have the same answer:
Let's point to the data in the file system using the given information. That means only the meta information about where the file is located is transferred, and the client picks up the data directly from where it is stored.

-          In scenario 1 the BLOB data is transferred via SharePoint.
-          Scenario 2 shows the way where only the meta information (blue arrows) is transferred via SharePoint and the user accesses the content directly.
Scenario 2 is what I will show now. The meta information about where the data is located is not part of the data in the “dbo.FileTableData” table. Also, there are some columns we do not need to transfer via BCS to SharePoint. So let's write a SQL query that returns only the needed, filtered information:
1.  SELECT
2.  stream_id,
3.  name,
4.  file_type,
5.  cached_file_size,
6.  creation_time,
7.  last_write_time,
8.  last_access_time,
9.  is_directory,
10. 'file:\\DEMO2010A\FileStream\FileStreamData\FileTableData\' + name AS Link
11. FROM dbo.FileTableData

Line 10 provides the “magic” here. This T-SQL expression generates, for every entry in the “dbo.FileTableData” table, a link to the original data in the file system. The link is attached as a new column called “Link”:
This query is now used to create a view called “FileTableView”.
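Wrapped into a view definition, this could look as follows (a sketch; the view name matches the one used in this post):

CREATE VIEW [dbo].[FileTableView]
AS
SELECT
    stream_id,
    name,
    file_type,
    cached_file_size,
    creation_time,
    last_write_time,
    last_access_time,
    is_directory,
    'file:\\DEMO2010A\FileStream\FileStreamData\FileTableData\' + name AS Link
FROM dbo.FileTableData;

As a side note, SQL Server 2012 can also compute this path without hard-coding the server name, via the built-in FileTableRootPath() function combined with file_stream.GetFileNamespacePath().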
The next step is connecting this view via BCS to SharePoint. I do this using SharePoint Designer to create a new External Content Type. How to create External Content Types and how to handle BCS is not part of this post. If you need help with this, just contact me or have a look here: LINK.
So here is my solution:
As you can see I created an External Content Source called “FileTableData” which is based on the view “FileTableView”. For this use case I only need a ReadList and a ReadItem operation.
Publishing this External Content Type to a SharePoint external list (and of course doing some XSL customizing) generates a list like this, which shows exactly the content of the FILESTREAM directory:
Clicking the “Link” given for each entry opens the document / folder. As shown in PIC 4, the access goes directly to where the content is stored.
Summary:
This is an interesting way of bringing data into a SharePoint environment without blowing up SharePoint's content databases. This data can also be protected with SQL Server backup and recovery tools. Of course there are several third-party tools doing this or similar things with more features. Anyway, for me it's a great feature and an additional option in data management.

See this post as a webcast here: