Data models are great for several things, the main one being normalization of data. You can bring in different types of logs with different field names and search them all using a single normalized field. For example, three different firewalls might name the source IP field differently: it might be called src_ip, client_ip, or source_address. Using a data model, you can search all of them under one standard name, src.
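To illustrate the same idea ad hoc (the index name and field names here are hypothetical), you can normalize the three firewall fields with coalesce, which is essentially what a data model's field aliasing does for you automatically:

```spl
index=firewall_logs
| eval src=coalesce(src_ip, client_ip, source_address)
| stats count by src
```

With a data model in place, you would instead search the normalized field directly (for example, All_Traffic.src in the Network Traffic CIM model) without repeating the coalesce in every search.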
There are a couple of ways of doing this. I would recommend bringing in all 90 days and then aggregating. You could use something like bin (or its alias bucket) on _time with a 30-day span. You could also use something like rangemap to break the results into easy-to-find 30-day groups. Another option is an eval based on the event's age, something like eval period=floor((now()-_time)/2592000), where 2592000 is 30 days in seconds, so each event lands in a 0, 1, or 2 group.
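A minimal sketch of the bin approach (the index name is an assumption; adjust it and the final aggregation to your data):

```spl
index=firewall earliest=-90d
| bin _time span=30d
| stats count by _time
```

This returns one row per 30-day bucket across the 90-day window; swap the final stats for whatever aggregation you actually need.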
Be sure you have the correct JDBC drivers and that they are installed in the correct location:
http://docs.splunk.com/Documentation/DBX/3.1.3/DeployDBX/Installdatabasedrivers