Upload CSV file using C# (ASP.NET MVC)

I have a CSV file that contains the following:

ProductName,EmployeeID,EmployeeName,ContactNo,Address
iPad,1233,Tom,89897898,34 Pitt st
iPad,1573,Jack,8978 9689,50 George st
iPad,1893,Peter,8878 8989,32 Martin st 
The following code inserts into a single table. What I'm trying to achieve is to insert into two tables:

Product table (parent table): ProductId (PK), ProductName
Employee table (child table): EmployeeId (PK), ProductId (FK), EmployeeName, ContactNo, Address
First of all, you should decouple your controller from the database code. Create a new class library project and host all the database access there, so that your controller can look something like this:

[HttpPost]
public ActionResult UploadFile(HttpPostedFileBase FileUpload)
{
    if (FileUpload.ContentLength > 0)
    {
        // there's a file that needs our attention
        var success = db.UploadProductFile(FileUpload);

        // was everything ok?
        if (success)
            return View("UploadSuccess");
        else
            return View("UploadFail");
    }

    return RedirectToAction("Index", new { error = "Please upload a file..." });
}

public ActionResult Index(string error)
{
    ...
}

This way the controller doesn't really care what you do with the uploaded file, and it shouldn't: its job is to know that it needs to delegate that work and handle the result, nothing more.

Note that the action method is called UploadFile, not Index. Posting back to the same action is not good practice, because the user would post the file again every time they refresh the page.
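If you want the success path to survive a refresh as well, you could redirect in every branch instead of returning a view from the POST. A sketch of that variation (the message route value and the null check are assumptions, not part of the original answer):

[HttpPost]
public ActionResult UploadFile(HttpPostedFileBase FileUpload)
{
    if (FileUpload == null || FileUpload.ContentLength == 0)
        return RedirectToAction("Index", new { error = "Please upload a file..." });

    var success = db.UploadProductFile(FileUpload);

    // always redirect, so refreshing the page issues a GET instead of re-posting the file
    return RedirectToAction("Index", new { message = success ? "Upload complete" : "Upload failed" });
}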

I also suggest you look at the ADO.NET Entity Model (Entity Framework); there are plenty of videos on the ASP.NET website, and it will help you work with the database in a much simpler and cleaner way.
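To illustrate how an Entity Framework model would simplify the parent/child insert, here is a minimal sketch. The context and entity class names (ProductContext, Product, Employee) and their properties are assumptions derived from the table layout above, not code from the question:

using System;
using System.Data.Entity;   // EF 5/6 style DbContext
using System.Linq;

public class Product
{
    public Guid ProductId { get; set; }
    public string ProductName { get; set; }
}

public class Employee
{
    public Guid EmployeeId { get; set; }
    public Guid ProductId { get; set; }
    public string EmployeeName { get; set; }
    public string ContactNo { get; set; }
    public string Address { get; set; }
    public virtual Product Product { get; set; }
}

public class ProductContext : DbContext
{
    public DbSet<Product> Products { get; set; }
    public DbSet<Employee> Employees { get; set; }
}

public class CsvImporter
{
    // inserts one CSV row, creating the product only if it does not exist yet
    public void SaveRow(string productName, string employeeName, string contactNo, string address)
    {
        using (var ctx = new ProductContext())
        {
            var product = ctx.Products.FirstOrDefault(p => p.ProductName == productName);
            if (product == null)
            {
                product = new Product { ProductId = Guid.NewGuid(), ProductName = productName };
                ctx.Products.Add(product);
            }

            ctx.Employees.Add(new Employee
            {
                EmployeeId = Guid.NewGuid(),
                Product = product,           // EF fills in the ProductId foreign key for us
                EmployeeName = employeeName,
                ContactNo = contactNo,
                Address = address
            });

            ctx.SaveChanges();
        }
    }
}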

Back to your question... In your database class, the UploadProductFile method would look something like the code below. Assuming you will never process more than about 200 records, it is better to read the uploaded stream directly rather than spend time saving the file and reading it back; for anything bigger you should save the file and process it afterwards, as you are already doing:

private bool UploadProductFile(HttpPostedFileBase FileUpload)
{
    // get the file stream in a readable way
    StreamReader reader = new StreamReader(FileUpload.InputStream);

    // get a DataTable that represents the passed string
    System.Data.DataTable dt = ProcessCSV(reader.ReadToEnd());

    // for each row, fire the insert statement
    bool success = true;
    foreach (System.Data.DataRow row in dt.Rows)
        success = db.InsertProdutInfo(row);

    return success;
}

The InsertProdutInfo method would then fire a query along these lines:
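The db.InsertProdutInfo(row) call above could be a thin ADO.NET wrapper that executes the stored procedure shown at the end of this post. A sketch, assuming the DataRow still holds the columns in CSV order and that the connection string is the same one used in ProcessBulkCopy:

using System.Configuration;
using System.Data;
using System.Data.SqlClient;

public bool InsertProdutInfo(DataRow row)
{
    string connString = ConfigurationManager.ConnectionStrings["DataBaseConnectionString"].ConnectionString;

    using (var conn = new SqlConnection(connString))
    using (var cmd = new SqlCommand("InsertProdutInfo", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // columns by CSV position: 0 = ProductName, 2 = EmployeeName, 4 = Address
        cmd.Parameters.AddWithValue("@ProductName", row[0].ToString());
        cmd.Parameters.AddWithValue("@EmployeeName", row[2].ToString());
        cmd.Parameters.AddWithValue("@EmployeeAddress", row[4].ToString());

        conn.Open();
        return cmd.ExecuteNonQuery() > 0;
    }
}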

declare @product_key int

begin transaction

   update [tbl_products]
   set    [name] = @product_name, [last_update] = getdate()
   where  [name] = @product_name;

   -- get product id
   select @product_key = [id]
   from   [tbl_products]
   where  [name] = @product_name;

   if @@rowcount = 0
   begin
      -- there's no such product, let's create it
      insert into [tbl_products] ([name], [last_update])
      values (@product_name, getdate());

      select @product_key = SCOPE_IDENTITY()
   end

   -- now that we know we have added the product and have the id, let's add the rest
   insert into [tbl_employees] ([id], [product_id], [name], [contact], [address])
   values (@employee_id, @product_key, @employee_name,
           @employee_contact, @employee_address);

commit transaction
That way you get everything you need.

What exactly is your question? Are you asking how to insert data into multiple tables? And is this really MVC, considering the aspx file?

I'm using the MVC 4 framework; you can use either the ASPX or the Razor view engine. And yes, I need to insert data into multiple tables. As described above, I need to take the information from the CSV file and insert it into the Product and Employee tables.

What error are you getting? Does your post reach the controller action, and if so, where does it get stuck?

Hi Niraj, if I only insert into the Employee table there is no problem. But since I need to insert into the Product table first, I need to work out how to do that. The information that goes into the Product table is the product name, with the ProductId generated automatically. That is the part I'm having trouble with.

Spidey, I can't see anywhere in your code where you add records to the Product table; I only see records being added to the Employee table in the ProcessBulkCopy method. You need to add the records to the Product table first, unless you have allowed the ProductId column of the Employee table to be null.

Hi balexandre, I'm having a problem with your stored procedure; I've added my version of it above. Thank you.

I would have to guess what your problem actually is...
[HttpPost]
public ActionResult Index(HttpPostedFileBase FileUpload)
{
    // Set up DataTable place holder 

    Guid ProductId= Guid.NewGuid();
    using (SqlConnection conn = new SqlConnection(connString))
    {
        conn.Open();

        using (SqlCommand cmd = new SqlCommand(
               "INSERT INTO Product VALUES(@ProductId, @ProductName)", conn))
        {
            //Note: the product name still needs to be read from the csv file
            cmd.Parameters.AddWithValue("@ProductId", ProductId);
            cmd.Parameters.AddWithValue("@ProductName", ProductName);

            int rows = cmd.ExecuteNonQuery();

            //rows = number of records that were inserted
        }
    }

    DataTable dt = new DataTable();

    //check we have a file 
    if (FileUpload.ContentLength > 0)
    {
        //Work out our file path
        string fileName = Path.GetFileName(FileUpload.FileName);
        string path = Path.Combine(Server.MapPath("~/App_Data/uploads"), fileName);

        //Try and upload
        try
        {
            FileUpload.SaveAs(path);
            //Process the CSV file and capture the results to our DataTable place holder
            dt = ProcessCSV(path);

            //Process the DataTable and capture the results to our SQL Bulk copy
            ViewData["Feedback"] = ProcessBulkCopy(dt);
        }
        catch (Exception ex)
        {
            //Catch errors
            ViewData["Feedback"] = ex.Message;
        }
    }
    else
    {
        //Catch errors
        ViewData["Feedback"] = "Please select a file";
    }

    //Tidy up
    dt.Dispose();

    return View("Index", ViewData["Feedback"]);
}

/// <summary>
/// Process the file supplied and process the CSV to a dynamic datatable
/// </summary>
/// <param name="fileName">String</param>
/// <returns>DataTable</returns>
private static DataTable ProcessCSV(string fileName)
{
    //Set up our variables 
    string Feedback = string.Empty;
    string line = string.Empty;
    string[] strArray;  
    DataTable dt = new DataTable();
    DataRow row;

    // work out where we should split on a comma, but not when it is inside a quoted value
    Regex r = new Regex(",(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))");

    //Set the filename in to our stream
    StreamReader sr = new StreamReader(fileName);

    //Read the first line and split the string at , with our regular express in to an array
    line = sr.ReadLine();
    strArray = r.Split(line);

    //For each item in the split array, dynamically build our data columns. Saves us having to worry about it.
    Array.ForEach(strArray, s => dt.Columns.Add(new DataColumn()));


    //Read each line in the CSV file until it's empty
    while ((line = sr.ReadLine()) != null)
    {
        row = dt.NewRow();

        //add our current value to our data row
        row.ItemArray = r.Split(line);
        dt.Rows.Add(row);
    }

    //Tidy the StreamReader up
    sr.Dispose();

    //return the new DataTable
    return dt;


}
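A side note on ProcessCSV: the header line is read but its values are discarded, so the DataTable ends up with the default column names (Column1, Column2, ...). If you keep the header names you can refer to values by name and map them during the bulk copy later. A sketch of that variant (the method name ProcessCSVWithHeaders is made up for illustration; it would sit next to ProcessCSV):

using System.Data;
using System.IO;
using System.Text.RegularExpressions;

private static DataTable ProcessCSVWithHeaders(string fileName)
{
    // split on commas that are not inside quotes (same regex as ProcessCSV)
    Regex r = new Regex(",(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))");
    DataTable dt = new DataTable();

    using (StreamReader sr = new StreamReader(fileName))
    {
        // use the header values as column names (ProductName, EmployeeID, ...)
        foreach (string header in r.Split(sr.ReadLine()))
            dt.Columns.Add(new DataColumn(header.Trim('"')));

        string line;
        while ((line = sr.ReadLine()) != null)
        {
            DataRow row = dt.NewRow();
            row.ItemArray = r.Split(line);
            dt.Rows.Add(row);
        }
    }

    return dt;
}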

/// <summary>
/// Take the DataTable and using WriteToServer(DataTable) send it all to the database table "BulkImportDetails" in one go
/// </summary>
/// <param name="dt">DataTable</param>
/// <returns>String</returns>
private static String ProcessBulkCopy(DataTable dt)
{
    string Feedback = string.Empty;
    string connString = ConfigurationManager.ConnectionStrings["DataBaseConnectionString"].ConnectionString;

    //make our connection and dispose at the end    
    using(  SqlConnection conn = new SqlConnection(connString))
    {
        //make our command and dispose at the end
        using (var copy = new SqlBulkCopy(conn))
        {
            //Open our connection
            conn.Open();

            //Set target table and tell the number of rows
            copy.DestinationTableName = "Employee";
            copy.BatchSize = dt.Rows.Count;
            try
            {
                //Send it to the server
                copy.WriteToServer(dt);
                Feedback = "Upload complete";
            }
            catch (Exception ex)
            {
                Feedback = ex.Message;
            }
        }
    }

    return Feedback;
}
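Regarding the actual question (inserting into Product first and then linking the Employee rows to it): before calling ProcessBulkCopy you could insert the product, stamp its id onto every row, and give SqlBulkCopy explicit column mappings so the CSV columns land in the right Employee columns. This is only a sketch based on the table layout described at the top of the question; the helper name PrepareEmployeeTable and the use of header-named DataTable columns (see the ProcessCSVWithHeaders sketch above) are assumptions:

using System;
using System.Data;
using System.Data.SqlClient;

private static DataTable PrepareEmployeeTable(DataTable csv, string connString)
{
    // the product name is the same for every row in the file, so take it from the first row
    string productName = csv.Rows[0]["ProductName"].ToString();
    Guid productId = Guid.NewGuid();

    using (var conn = new SqlConnection(connString))
    using (var cmd = new SqlCommand(
        "INSERT INTO Product (ProductId, ProductName) VALUES (@ProductId, @ProductName)", conn))
    {
        cmd.Parameters.AddWithValue("@ProductId", productId);
        cmd.Parameters.AddWithValue("@ProductName", productName);
        conn.Open();
        cmd.ExecuteNonQuery();
    }

    // stamp the new ProductId onto every CSV row so the bulk copy can fill the foreign key
    csv.Columns.Add("ProductId", typeof(Guid));
    foreach (DataRow row in csv.Rows)
        row["ProductId"] = productId;

    return csv;
}

Inside ProcessBulkCopy, before WriteToServer, the mappings would then tell SqlBulkCopy which source column feeds which Employee column (the destination column names are assumptions):

copy.ColumnMappings.Add("EmployeeID", "EmployeeId");
copy.ColumnMappings.Add("ProductId", "ProductId");
copy.ColumnMappings.Add("EmployeeName", "EmployeeName");
copy.ColumnMappings.Add("ContactNo", "ContactNo");
copy.ColumnMappings.Add("Address", "Address");

With this in place, the Product insert at the top of the Index action is no longer needed, since the product name only becomes known once the CSV has been parsed.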
<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
    Home Page
</asp:Content>

<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">

    <h2>CSV Bulk Upload</h2> 

    <% using (Html.BeginForm("","",FormMethod.Post, new {enctype="multipart/form-data"})){ %>

        <input type="file" name="FileUpload" />
        <input type="submit" name="Submit" id="Submit" value="Upload" />
    <% } %>

    <p><%= Html.Encode(ViewData["Feedback"]) %></p> 
</asp:Content>
USE [BULkDatabase]
GO


SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER OFF
GO



CREATE PROCEDURE [dbo].[InsertProdutInfo] 
(
   @ProductName varchar (50),  
   @EmployeeName varchar (50),
   @EmployeeAddress varchar (50)
)

AS


BEGIN TRAN

   DECLARE @ProductId uniqueidentifier

   -- get the product id if the product already exists
   select @ProductId = [ProductId]
   from   [dbo].[Product]
   where  [ProductName] = @ProductName;

   if @@rowcount = 0
   begin
      -- there's no such product, let's create it
      -- (SCOPE_IDENTITY() only works for identity columns, so generate the uniqueidentifier up front)
      set @ProductId = NEWID();

      insert into [dbo].[Product]
      values (@ProductId, @ProductName);
   end

   -- now that we know the product exists and we have its id, let's add the rest
   insert into [dbo].[Employees]
   values (NEWID(), @EmployeeName, @EmployeeAddress, @ProductId);

COMMIT TRAN